
Add DataLoader.epochs() method and Dataset.to_iter(epochs=) argument #1147

Merged
13 commits merged on Oct 20, 2021

Conversation

rjzamora
Collaborator

This PR implements the proposed API change discussed in #1142
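
For reference, a minimal usage sketch of the API this PR adds (illustrative only; it assumes the existing `nvt.Dataset` and the Torch loader `TorchAsyncItr`, `process`/`train_step` and the column names are placeholders, and the final signatures may differ):

```python
# Illustrative sketch only -- not taken verbatim from this PR.
import nvtabular as nvt
from nvtabular.loader.torch import TorchAsyncItr  # assumed loader; any NVTabular DataLoader subclass

dataset = nvt.Dataset("data.parquet")

# Dataset.to_iter(epochs=...): iterate over the dataset's partitions `epochs` times.
for part in dataset.to_iter(epochs=2):
    process(part)  # placeholder for user code

# DataLoader.epochs(n): re-iterate an existing loader for n passes over the data.
loader = TorchAsyncItr(
    dataset,
    batch_size=1024,
    cats=["cat_col"],   # placeholder column names
    conts=["num_col"],
    labels=["label"],
)
for batch in loader.epochs(2):
    train_step(batch)  # placeholder for user code
```

Only the `epochs=` argument and the `epochs()` method are new here; everything else shown is the pre-existing API.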

@rjzamora rjzamora added the IO label Sep 24, 2021
@rjzamora rjzamora self-assigned this Sep 24, 2021
@nvidia-merlin-bot
Contributor

CI Results
GitHub pull request #1147 of commit 7deea554d70b04468a2a39b12cd2ccc604afdf39, no merge conflicts.
Running as SYSTEM
Setting status of 7deea554d70b04468a2a39b12cd2ccc604afdf39 to PENDING with url http://10.20.13.93:8080/job/nvtabular_tests/3551/ and message: 'Pending'
Using context: Jenkins Unit Test Run
Building in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA/NVTabular.git
 > git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA/NVTabular.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA/NVTabular.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA/NVTabular.git +refs/pull/1147/*:refs/remotes/origin/pr/1147/* # timeout=10
 > git rev-parse 7deea554d70b04468a2a39b12cd2ccc604afdf39^{commit} # timeout=10
Checking out Revision 7deea554d70b04468a2a39b12cd2ccc604afdf39 (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 7deea554d70b04468a2a39b12cd2ccc604afdf39 # timeout=10
Commit message: "fix test"
 > git rev-list --no-walk 65aa483797ec5415359bd9f049358afa03b2c8fa # timeout=10
First time build. Skipping changelog.
[nvtabular_tests] $ /bin/bash /tmp/jenkins149838303830390004.sh
Installing NVTabular
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Requirement already satisfied: pip in /var/jenkins_home/.local/lib/python3.8/site-packages (21.2.4)
Requirement already satisfied: setuptools in /var/jenkins_home/.local/lib/python3.8/site-packages (58.1.0)
Requirement already satisfied: wheel in /var/jenkins_home/.local/lib/python3.8/site-packages (0.37.0)
Requirement already satisfied: pybind11 in /var/jenkins_home/.local/lib/python3.8/site-packages (2.7.1)
running develop
running egg_info
creating nvtabular.egg-info
writing nvtabular.egg-info/PKG-INFO
writing dependency_links to nvtabular.egg-info/dependency_links.txt
writing requirements to nvtabular.egg-info/requires.txt
writing top-level names to nvtabular.egg-info/top_level.txt
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching '*.h' under directory 'cpp'
warning: no files found matching '*.cu' under directory 'cpp'
warning: no files found matching '*.cuh' under directory 'cpp'
adding license file 'LICENSE'
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
running build_ext
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.8 -c flagcheck.cpp -o flagcheck.o -std=c++17
building 'nvtabular_cpp' extension
creating build
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/cpp
creating build/temp.linux-x86_64-3.8/cpp/nvtabular
creating build/temp.linux-x86_64-3.8/cpp/nvtabular/inference
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.6.0+80.g7deea55 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.6.0+80.g7deea55 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.6.0+80.g7deea55 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/categorify.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.6.0+80.g7deea55 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/fill.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -std=c++17 -fvisibility=hidden -g0
creating build/lib.linux-x86_64-3.8
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -o build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so
copying build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so -> 
Generating nvtabular/inference/triton/model_config_pb2.py from nvtabular/inference/triton/model_config.proto
Creating /var/jenkins_home/.local/lib/python3.8/site-packages/nvtabular.egg-link (link to .)
nvtabular 0.6.0+80.g7deea55 is already the active version in easy-install.pth

Installed /var/jenkins_home/workspace/nvtabular_tests/nvtabular
Processing dependencies for nvtabular==0.6.0+80.g7deea55
Searching for protobuf==3.18.0
Best match: protobuf 3.18.0
Adding protobuf 3.18.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for tensorflow-metadata==1.2.0
Best match: tensorflow-metadata 1.2.0
Processing tensorflow_metadata-1.2.0-py3.8.egg
tensorflow-metadata 1.2.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tensorflow_metadata-1.2.0-py3.8.egg
Searching for pyarrow==4.0.1
Best match: pyarrow 4.0.1
Adding pyarrow 4.0.1 to easy-install.pth file
Installing plasma_store script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for tqdm==4.61.2
Best match: tqdm 4.61.2
Processing tqdm-4.61.2-py3.8.egg
tqdm 4.61.2 is already the active version in easy-install.pth
Installing tqdm script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tqdm-4.61.2-py3.8.egg
Searching for numba==0.54.0
Best match: numba 0.54.0
Processing numba-0.54.0-py3.8-linux-x86_64.egg
numba 0.54.0 is already the active version in easy-install.pth
Installing pycc script to /var/jenkins_home/.local/bin
Installing numba script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/numba-0.54.0-py3.8-linux-x86_64.egg
Searching for pandas==1.2.5
Best match: pandas 1.2.5
Processing pandas-1.2.5-py3.8-linux-x86_64.egg
Removing pandas 1.3.3 from easy-install.pth file
Adding pandas 1.2.5 to easy-install.pth file

Using /var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.2.5-py3.8-linux-x86_64.egg
Searching for distributed==2021.4.1
Best match: distributed 2021.4.1
Removing distributed 2021.7.1 from easy-install.pth file
Adding distributed 2021.4.1 to easy-install.pth file
Installing dask-ssh script to /var/jenkins_home/.local/bin
Installing dask-scheduler script to /var/jenkins_home/.local/bin
Installing dask-worker script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for dask==2021.4.1
Best match: dask 2021.4.1
Adding dask 2021.4.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for PyYAML==5.4.1
Best match: PyYAML 5.4.1
Processing PyYAML-5.4.1-py3.8-linux-x86_64.egg
PyYAML 5.4.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg
Searching for googleapis-common-protos==1.53.0
Best match: googleapis-common-protos 1.53.0
Processing googleapis_common_protos-1.53.0-py3.8.egg
googleapis-common-protos 1.53.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/googleapis_common_protos-1.53.0-py3.8.egg
Searching for absl-py==0.12.0
Best match: absl-py 0.12.0
Processing absl_py-0.12.0-py3.8.egg
absl-py 0.12.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/absl_py-0.12.0-py3.8.egg
Searching for numpy==1.20.2
Best match: numpy 1.20.2
Adding numpy 1.20.2 to easy-install.pth file
Installing f2py script to /var/jenkins_home/.local/bin
Installing f2py3 script to /var/jenkins_home/.local/bin
Installing f2py3.8 script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for setuptools==58.1.0
Best match: setuptools 58.1.0
Adding setuptools 58.1.0 to easy-install.pth file

Using /var/jenkins_home/.local/lib/python3.8/site-packages
Searching for llvmlite==0.37.0
Best match: llvmlite 0.37.0
Processing llvmlite-0.37.0-py3.8-linux-x86_64.egg
llvmlite 0.37.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/llvmlite-0.37.0-py3.8-linux-x86_64.egg
Searching for pytz==2021.1
Best match: pytz 2021.1
Adding pytz 2021.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for python-dateutil==2.8.2
Best match: python-dateutil 2.8.2
Adding python-dateutil 2.8.2 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for msgpack==1.0.2
Best match: msgpack 1.0.2
Processing msgpack-1.0.2-py3.8-linux-x86_64.egg
msgpack 1.0.2 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/msgpack-1.0.2-py3.8-linux-x86_64.egg
Searching for tblib==1.7.0
Best match: tblib 1.7.0
Processing tblib-1.7.0-py3.8.egg
tblib 1.7.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg
Searching for cloudpickle==1.6.0
Best match: cloudpickle 1.6.0
Processing cloudpickle-1.6.0-py3.8.egg
cloudpickle 1.6.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/cloudpickle-1.6.0-py3.8.egg
Searching for tornado==6.1
Best match: tornado 6.1
Processing tornado-6.1-py3.8-linux-x86_64.egg
tornado 6.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg
Searching for sortedcontainers==2.4.0
Best match: sortedcontainers 2.4.0
Processing sortedcontainers-2.4.0-py3.8.egg
sortedcontainers 2.4.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg
Searching for toolz==0.11.1
Best match: toolz 0.11.1
Processing toolz-0.11.1-py3.8.egg
toolz 0.11.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/toolz-0.11.1-py3.8.egg
Searching for click==8.0.1
Best match: click 8.0.1
Processing click-8.0.1-py3.8.egg
click 8.0.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/click-8.0.1-py3.8.egg
Searching for zict==2.0.0
Best match: zict 2.0.0
Processing zict-2.0.0-py3.8.egg
zict 2.0.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg
Searching for psutil==5.8.0
Best match: psutil 5.8.0
Processing psutil-5.8.0-py3.8-linux-x86_64.egg
psutil 5.8.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg
Searching for fsspec==2021.8.1
Best match: fsspec 2021.8.1
Processing fsspec-2021.8.1-py3.8.egg
fsspec 2021.8.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/fsspec-2021.8.1-py3.8.egg
Searching for partd==1.2.0
Best match: partd 1.2.0
Processing partd-1.2.0-py3.8.egg
partd 1.2.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg
Searching for six==1.15.0
Best match: six 1.15.0
Adding six 1.15.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for HeapDict==1.0.1
Best match: HeapDict 1.0.1
Processing HeapDict-1.0.1-py3.8.egg
HeapDict 1.0.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg
Searching for locket==0.2.1
Best match: locket 0.2.1
Processing locket-0.2.1-py3.8.egg
locket 0.2.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg
Finished processing dependencies for nvtabular==0.6.0+80.g7deea55
Running black --check
All done! ✨ 🍰 ✨
129 files would be left unchanged.
Running flake8
Running isort
Skipped 2 files
Running bandit
Running pylint
************* Module nvtabular.ops.categorify
nvtabular/ops/categorify.py:497:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)
************* Module nvtabular.ops.fill
nvtabular/ops/fill.py:67:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)


Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)

Running flake8-nb
Building docs
make: Entering directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.7) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))

Warning, treated as error:
autodoc: failed to import module 'io.dataset' from module 'nvtabular'; the following exception was raised:
cannot import name 'apply' from 'dask.compatibility' (/usr/local/lib/python3.8/dist-packages/dask/compatibility.py)
make: *** [Makefile:20: html] Error 2
make: Leaving directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://github.com/gitapi/repos/NVIDIA/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins4833311513480222943.sh

@nvidia-merlin-bot
Contributor

CI Results
GitHub pull request #1147 of commit 560a44ff0c2f57387de13adff16ad3055cd83a5a, no merge conflicts.
Running as SYSTEM
Setting status of 560a44ff0c2f57387de13adff16ad3055cd83a5a to PENDING with url http://10.20.13.93:8080/job/nvtabular_tests/3552/ and message: 'Pending'
Using context: Jenkins Unit Test Run
Building in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA/NVTabular.git
 > git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA/NVTabular.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA/NVTabular.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA/NVTabular.git +refs/pull/1147/*:refs/remotes/origin/pr/1147/* # timeout=10
 > git rev-parse 560a44ff0c2f57387de13adff16ad3055cd83a5a^{commit} # timeout=10
Checking out Revision 560a44ff0c2f57387de13adff16ad3055cd83a5a (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 560a44ff0c2f57387de13adff16ad3055cd83a5a # timeout=10
Commit message: "Merge branch 'main' into epochs-arg"
 > git rev-list --no-walk 7deea554d70b04468a2a39b12cd2ccc604afdf39 # timeout=10
[nvtabular_tests] $ /bin/bash /tmp/jenkins1037215084827211300.sh
Installing NVTabular
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Requirement already satisfied: pip in /var/jenkins_home/.local/lib/python3.8/site-packages (21.2.4)
Requirement already satisfied: setuptools in /var/jenkins_home/.local/lib/python3.8/site-packages (58.1.0)
Requirement already satisfied: wheel in /var/jenkins_home/.local/lib/python3.8/site-packages (0.37.0)
Requirement already satisfied: pybind11 in /var/jenkins_home/.local/lib/python3.8/site-packages (2.7.1)
running develop
running egg_info
creating nvtabular.egg-info
writing nvtabular.egg-info/PKG-INFO
writing dependency_links to nvtabular.egg-info/dependency_links.txt
writing requirements to nvtabular.egg-info/requires.txt
writing top-level names to nvtabular.egg-info/top_level.txt
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching '*.h' under directory 'cpp'
warning: no files found matching '*.cu' under directory 'cpp'
warning: no files found matching '*.cuh' under directory 'cpp'
adding license file 'LICENSE'
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
running build_ext
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.8 -c flagcheck.cpp -o flagcheck.o -std=c++17
building 'nvtabular_cpp' extension
creating build
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/cpp
creating build/temp.linux-x86_64-3.8/cpp/nvtabular
creating build/temp.linux-x86_64-3.8/cpp/nvtabular/inference
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.6.0+84.g560a44f -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.6.0+84.g560a44f -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.6.0+84.g560a44f -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/categorify.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.6.0+84.g560a44f -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/fill.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -std=c++17 -fvisibility=hidden -g0
creating build/lib.linux-x86_64-3.8
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -o build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so
copying build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so -> 
Generating nvtabular/inference/triton/model_config_pb2.py from nvtabular/inference/triton/model_config.proto
Creating /var/jenkins_home/.local/lib/python3.8/site-packages/nvtabular.egg-link (link to .)
nvtabular 0.6.0+84.g560a44f is already the active version in easy-install.pth

Installed /var/jenkins_home/workspace/nvtabular_tests/nvtabular
Processing dependencies for nvtabular==0.6.0+84.g560a44f
Searching for protobuf==3.18.0
Best match: protobuf 3.18.0
Adding protobuf 3.18.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for tensorflow-metadata==1.2.0
Best match: tensorflow-metadata 1.2.0
Processing tensorflow_metadata-1.2.0-py3.8.egg
tensorflow-metadata 1.2.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tensorflow_metadata-1.2.0-py3.8.egg
Searching for pyarrow==4.0.1
Best match: pyarrow 4.0.1
Adding pyarrow 4.0.1 to easy-install.pth file
Installing plasma_store script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for tqdm==4.61.2
Best match: tqdm 4.61.2
Processing tqdm-4.61.2-py3.8.egg
tqdm 4.61.2 is already the active version in easy-install.pth
Installing tqdm script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tqdm-4.61.2-py3.8.egg
Searching for numba==0.54.0
Best match: numba 0.54.0
Processing numba-0.54.0-py3.8-linux-x86_64.egg
numba 0.54.0 is already the active version in easy-install.pth
Installing pycc script to /var/jenkins_home/.local/bin
Installing numba script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/numba-0.54.0-py3.8-linux-x86_64.egg
Searching for pandas==1.3.3
Best match: pandas 1.3.3
Processing pandas-1.3.3-py3.8-linux-x86_64.egg
Removing pandas 1.2.5 from easy-install.pth file
Adding pandas 1.3.3 to easy-install.pth file

Using /var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg
Searching for distributed==2021.7.1
Best match: distributed 2021.7.1
Processing distributed-2021.7.1-py3.8.egg
Adding distributed 2021.7.1 to easy-install.pth file
Installing dask-ssh script to /var/jenkins_home/.local/bin
Installing dask-scheduler script to /var/jenkins_home/.local/bin
Installing dask-worker script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/distributed-2021.7.1-py3.8.egg
Searching for dask==2021.7.1
Best match: dask 2021.7.1
Adding dask 2021.7.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for PyYAML==5.4.1
Best match: PyYAML 5.4.1
Processing PyYAML-5.4.1-py3.8-linux-x86_64.egg
PyYAML 5.4.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg
Searching for googleapis-common-protos==1.53.0
Best match: googleapis-common-protos 1.53.0
Processing googleapis_common_protos-1.53.0-py3.8.egg
googleapis-common-protos 1.53.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/googleapis_common_protos-1.53.0-py3.8.egg
Searching for absl-py==0.12.0
Best match: absl-py 0.12.0
Processing absl_py-0.12.0-py3.8.egg
absl-py 0.12.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/absl_py-0.12.0-py3.8.egg
Searching for numpy==1.20.2
Best match: numpy 1.20.2
Adding numpy 1.20.2 to easy-install.pth file
Installing f2py script to /var/jenkins_home/.local/bin
Installing f2py3 script to /var/jenkins_home/.local/bin
Installing f2py3.8 script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for setuptools==58.1.0
Best match: setuptools 58.1.0
Adding setuptools 58.1.0 to easy-install.pth file

Using /var/jenkins_home/.local/lib/python3.8/site-packages
Searching for llvmlite==0.37.0
Best match: llvmlite 0.37.0
Processing llvmlite-0.37.0-py3.8-linux-x86_64.egg
llvmlite 0.37.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/llvmlite-0.37.0-py3.8-linux-x86_64.egg
Searching for pytz==2021.1
Best match: pytz 2021.1
Adding pytz 2021.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for python-dateutil==2.8.2
Best match: python-dateutil 2.8.2
Adding python-dateutil 2.8.2 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for zict==2.0.0
Best match: zict 2.0.0
Processing zict-2.0.0-py3.8.egg
zict 2.0.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg
Searching for tornado==6.1
Best match: tornado 6.1
Processing tornado-6.1-py3.8-linux-x86_64.egg
tornado 6.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg
Searching for toolz==0.11.1
Best match: toolz 0.11.1
Processing toolz-0.11.1-py3.8.egg
toolz 0.11.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/toolz-0.11.1-py3.8.egg
Searching for tblib==1.7.0
Best match: tblib 1.7.0
Processing tblib-1.7.0-py3.8.egg
tblib 1.7.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg
Searching for sortedcontainers==2.4.0
Best match: sortedcontainers 2.4.0
Processing sortedcontainers-2.4.0-py3.8.egg
sortedcontainers 2.4.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg
Searching for psutil==5.8.0
Best match: psutil 5.8.0
Processing psutil-5.8.0-py3.8-linux-x86_64.egg
psutil 5.8.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg
Searching for msgpack==1.0.2
Best match: msgpack 1.0.2
Processing msgpack-1.0.2-py3.8-linux-x86_64.egg
msgpack 1.0.2 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/msgpack-1.0.2-py3.8-linux-x86_64.egg
Searching for cloudpickle==1.6.0
Best match: cloudpickle 1.6.0
Processing cloudpickle-1.6.0-py3.8.egg
cloudpickle 1.6.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/cloudpickle-1.6.0-py3.8.egg
Searching for click==8.0.1
Best match: click 8.0.1
Processing click-8.0.1-py3.8.egg
click 8.0.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/click-8.0.1-py3.8.egg
Searching for fsspec==2021.8.1
Best match: fsspec 2021.8.1
Processing fsspec-2021.8.1-py3.8.egg
fsspec 2021.8.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/fsspec-2021.8.1-py3.8.egg
Searching for partd==1.2.0
Best match: partd 1.2.0
Processing partd-1.2.0-py3.8.egg
partd 1.2.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg
Searching for packaging==21.0
Best match: packaging 21.0
Adding packaging 21.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for six==1.15.0
Best match: six 1.15.0
Adding six 1.15.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for HeapDict==1.0.1
Best match: HeapDict 1.0.1
Processing HeapDict-1.0.1-py3.8.egg
HeapDict 1.0.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg
Searching for locket==0.2.1
Best match: locket 0.2.1
Processing locket-0.2.1-py3.8.egg
locket 0.2.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg
Searching for pyparsing==2.4.7
Best match: pyparsing 2.4.7
Adding pyparsing 2.4.7 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Finished processing dependencies for nvtabular==0.6.0+84.g560a44f
Running black --check
All done! ✨ 🍰 ✨
130 files would be left unchanged.
Running flake8
Running isort
Skipped 2 files
Running bandit
Running pylint
************* Module nvtabular.ops.categorify
nvtabular/ops/categorify.py:497:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)
************* Module nvtabular.ops.fill
nvtabular/ops/fill.py:67:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)


Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)

Running flake8-nb
Building docs
make: Entering directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.7) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
make: Leaving directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-6.2.5, py-1.10.0, pluggy-1.0.0
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: cov-2.12.1, forked-1.3.0, xdist-2.4.0
collected 1543 items / 1 skipped / 1542 selected

tests/unit/test_dask_nvt.py ............................................ [ 2%]
....................................................................... [ 7%]
tests/unit/test_io.py .................................................. [ 10%]
........................................................................ [ 15%]
.................ssssssss............................................... [ 20%]
........ [ 20%]
tests/unit/test_notebooks.py ...... [ 20%]
tests/unit/test_tf4rec.py . [ 20%]
tests/unit/test_tools.py ...................... [ 22%]
tests/unit/test_triton_inference.py .............................. [ 24%]
tests/unit/columns/test_column_schemas.py .............................. [ 26%]
.................................................... [ 29%]
tests/unit/columns/test_column_selector.py .................... [ 30%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 31%]
tests/unit/framework_utils/test_tf_layers.py ........................... [ 32%]
..................................................F [ 36%]
tests/unit/framework_utils/test_torch_layers.py . [ 36%]
tests/unit/loader/test_dataloader_backend.py .... [ 36%]
tests/unit/loader/test_tf_dataloader.py ................................ [ 38%]
........................................s.. [ 41%]
tests/unit/loader/test_torch_dataloader.py ............................. [ 43%]
........................................................ [ 46%]
tests/unit/ops/test_column_similarity.py ........................ [ 48%]
tests/unit/ops/test_ops.py ............................................. [ 51%]
........................................................................ [ 55%]
........................................................................ [ 60%]
........................................................................ [ 65%]
........................................................................ [ 69%]
........................................................................ [ 74%]
................................................. [ 77%]
tests/unit/ops/test_ops_schema.py ...................................... [ 80%]
........................................................................ [ 84%]
........................................................................ [ 89%]
.......................... [ 91%]
tests/unit/workflow/test_cpu_workflow.py ...... [ 91%]
tests/unit/workflow/test_workflow.py ................................... [ 93%]
.......................................................... [ 97%]
tests/unit/workflow/test_workflow_node.py ........... [ 98%]
tests/unit/workflow/test_workflow_ops.py .. [ 98%]
tests/unit/workflow/test_workflow_schemas.py ....................... [100%]

=================================== FAILURES ===================================
___________________________ test_multihot_empty_rows ___________________________

def test_multihot_empty_rows():
    multi_hot = tf.feature_column.categorical_column_with_identity("multihot", 5)
    multi_hot_embedding = tf.feature_column.embedding_column(multi_hot, 8, combiner="sum")

    embedding_layer = layers.DenseFeatures([multi_hot_embedding])
    inputs = {
        "multihot": (
            tf.keras.Input(name="multihot__values", shape=(1,), dtype=tf.int64),
            tf.keras.Input(name="multihot__nnzs", shape=(1,), dtype=tf.int64),
        )
    }
    output = embedding_layer(inputs)

    model = tf.keras.Model(inputs=inputs, outputs=output)
    model.compile("sgd", "binary_crossentropy")

    multi_hot_values = np.array([0, 2, 1, 4, 1, 3, 1])
    multi_hot_nnzs = np.array([1, 0, 2, 4, 0])
    x = {"multihot": (multi_hot_values[:, None], multi_hot_nnzs[:, None])}

    multi_hot_embedding_table = embedding_layer.embedding_tables["multihot"].numpy()
    multi_hot_embedding_rows = _compute_expected_multi_hot(
        multi_hot_embedding_table, multi_hot_values, multi_hot_nnzs, "sum"
    )

    y_hat = model(x).numpy()
>       np.testing.assert_allclose(y_hat, multi_hot_embedding_rows, rtol=1e-06)

E AssertionError:
E Not equal to tolerance rtol=1e-06, atol=0
E
E Mismatched elements: 1 / 40 (2.5%)
E Max absolute difference: 7.450581e-09
E Max relative difference: 4.697791e-06
E x: array([[-7.085864e-01, -3.664722e-01, 3.241250e-01, 6.299742e-01,
E 8.547377e-01, 9.007017e-02, -1.291421e-01, -2.643898e-01],
E [ 0.000000e+00, 0.000000e+00, 0.000000e+00, 0.000000e+00,...
E y: array([[-7.085864e-01, -3.664722e-01, 3.241250e-01, 6.299742e-01,
E 8.547377e-01, 9.007017e-02, -1.291421e-01, -2.643898e-01],
E [ 0.000000e+00, 0.000000e+00, 0.000000e+00, 0.000000e+00,...

tests/unit/framework_utils/test_tf_layers.py:321: AssertionError
=============================== warnings summary ===============================
tests/unit/test_dask_nvt.py: 3 warnings
tests/unit/test_io.py: 24 warnings
tests/unit/test_tf4rec.py: 1 warning
tests/unit/test_tools.py: 2 warnings
tests/unit/test_triton_inference.py: 5 warnings
tests/unit/loader/test_tf_dataloader.py: 50 warnings
tests/unit/loader/test_torch_dataloader.py: 16 warnings
tests/unit/ops/test_column_similarity.py: 7 warnings
tests/unit/ops/test_ops.py: 74 warnings
tests/unit/workflow/test_workflow.py: 30 warnings
tests/unit/workflow/test_workflow_node.py: 1 warning
tests/unit/workflow/test_workflow_schemas.py: 1 warning
/var/jenkins_home/.local/lib/python3.8/site-packages/numba-0.54.0-py3.8-linux-x86_64.egg/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (1) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))

tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 36 warnings
tests/unit/workflow/test_workflow.py: 44 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/workflow.py:89: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for execution. Please use the client argument to initialize a Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 52 warnings
tests/unit/workflow/test_workflow.py: 35 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dask.py:375: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for this write operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/test_io.py: 96 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/__init__.py:38: DeprecationWarning: ColumnGroup is deprecated, use ColumnSelector instead
warnings.warn("ColumnGroup is deprecated, use ColumnSelector instead", DeprecationWarning)

tests/unit/test_io.py: 24 warnings
tests/unit/loader/test_torch_dataloader.py: 54 warnings
tests/unit/workflow/test_workflow_node.py: 1 warning
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/node.py:47: FutureWarning: The ["a", "b", "c"] >> ops.Operator syntax for creating a ColumnGroup has been deprecated in NVTabular 21.09 and will be removed in a future version.
warnings.warn(

tests/unit/test_io.py: 20 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:516: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler is being used for this shuffle operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.1]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:125: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[f"{col}_filled"] = df[col].isna()

tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.1]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:126: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[col] = df[col].fillna(self.medians[col])

tests/unit/ops/test_ops.py::test_fill_missing[True-True-parquet]
tests/unit/ops/test_ops.py::test_fill_missing[True-False-parquet]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg/pandas/core/indexing.py:1732: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self._setitem_single_block(indexer, value, name)

tests/unit/ops/test_ops.py::test_fill_missing[True-True-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:54: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[f"{col}_filled"] = df[col].isna()

tests/unit/ops/test_ops.py::test_fill_missing[True-True-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:55: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[col] = df[col].fillna(self.fill_val)

tests/unit/ops/test_ops.py: 80 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/join_external.py:191: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[tmp] = _arange(len(df), like_df=df, dtype="int32")

tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_groupby_op[id-True]
tests/unit/ops/test_ops.py::test_groupby_op[id-False]
/usr/local/lib/python3.8/dist-packages/dask/dataframe/core.py:6778: UserWarning: Insufficient elements for head. 1 elements requested, only 0 elements available. Try passing larger npartitions to head.
warnings.warn(msg.format(n, len(r)))

tests/unit/workflow/test_cpu_workflow.py: 14 warnings
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg/pandas/core/frame.py:3641: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self[k1] = value[k2]

-- Docs: https://docs.pytest.org/en/stable/warnings.html

---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name Stmts Miss Branch BrPart Cover Missing

examples/multi-gpu-movielens/torch_trainer.py 65 0 6 1 99% 32->36
examples/multi-gpu-movielens/torch_trainer_dist.py 63 0 2 0 100%
nvtabular/__init__.py 18 0 0 0 100%
nvtabular/columns/__init__.py 2 0 0 0 100%
nvtabular/columns/schema.py 217 17 107 20 89% 46->62, 49, 51, 53-56, 58, 98->114, 106, 110, 156, 183, 269->276, 272->274, 283, 300->305, 303->305, 316, 340, 347, 356, 359, 364->363
nvtabular/columns/selector.py 74 1 34 0 99% 121
nvtabular/dispatch.py 290 55 144 23 79% 36-40, 45-47, 53-63, 70-71, 114-116, 121-124, 128-133, 140, 159, 170, 176, 181->183, 194, 217-220, 259->261, 268, 271, 277, 293, 300, 331->336, 334, 337, 340->344, 377, 388-391, 417-420, 450, 454, 495, 519, 521, 528
nvtabular/framework_utils/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/tensorflow/__init__.py 1 0 0 0 100%
nvtabular/framework_utils/tensorflow/feature_column_utils.py 134 78 90 15 39% 30, 99, 103, 114-130, 140, 143-158, 162, 166-167, 173-198, 207-217, 220-227, 229->233, 234, 239-279, 282
nvtabular/framework_utils/tensorflow/layers/__init__.py 4 0 0 0 100%
nvtabular/framework_utils/tensorflow/layers/embedding.py 153 12 85 6 91% 60, 68->49, 122, 179, 231-239, 335->343, 357->360, 363-364, 367
nvtabular/framework_utils/tensorflow/layers/interaction.py 47 25 20 1 43% 49, 74-103, 106-110, 113
nvtabular/framework_utils/tensorflow/layers/outer_product.py 30 24 10 0 15% 37-38, 41-60, 71-84, 87
nvtabular/framework_utils/tensorflow/tfrecords_to_parquet.py 58 58 30 0 0% 16-111
nvtabular/framework_utils/torch/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/torch/layers/__init__.py 2 0 0 0 100%
nvtabular/framework_utils/torch/layers/embeddings.py 32 2 14 2 91% 50, 91
nvtabular/framework_utils/torch/models.py 45 1 28 4 93% 57->61, 87->89, 93->96, 103
nvtabular/framework_utils/torch/utils.py 75 5 30 5 90% 51->53, 64, 71->76, 75, 118-120
nvtabular/inference/__init__.py 0 0 0 0 100%
nvtabular/inference/triton/__init__.py 385 210 180 13 45% 82-86, 141-174, 195-218, 263-307, 338, 364-372, 380-387, 406, 428-444, 485-489, 527-537, 583-623, 629-645, 649-716, 723->726, 726->722, 762-772, 781, 791, 812, 818-844, 850-876, 883, 889->892, 893
nvtabular/inference/triton/benchmarking_tools.py 52 52 10 0 0% 2-103
nvtabular/inference/triton/data_conversions.py 87 3 58 4 95% 32-33, 84
nvtabular/inference/triton/model.py 176 176 98 0 0% 27-332
nvtabular/inference/triton/model_config_pb2.py 299 0 2 0 100%
nvtabular/inference/triton/model_pt.py 101 101 40 0 0% 27-220
nvtabular/io/__init__.py 4 0 0 0 100%
nvtabular/io/avro.py 88 88 30 0 0% 16-189
nvtabular/io/csv.py 57 6 20 5 86% 22-23, 99, 103->107, 108, 110, 124
nvtabular/io/dask.py 183 8 72 11 93% 111, 114, 150, 401, 411, 428->431, 439, 443->445, 445->441, 450, 452
nvtabular/io/dataframe_engine.py 61 5 28 6 88% 19-20, 50, 69, 88->92, 92->97, 94->97, 97->116, 125
nvtabular/io/dataset.py 361 44 172 29 85% 47-48, 264, 266, 279, 304-318, 441->515, 446-449, 454->464, 471->469, 472->476, 489->493, 504, 515->524, 575-576, 577->581, 629, 752, 754, 756, 762, 766-768, 770, 830-831, 858, 865-866, 872, 878, 975-976, 1094-1099, 1105, 1117-1118, 1206
nvtabular/io/dataset_engine.py 31 1 4 0 97% 48
nvtabular/io/fsspec_utils.py 117 103 64 0 8% 26-27, 42-98, 103-135, 151-191, 213-263, 268-284, 288-290, 304-315
nvtabular/io/hugectr.py 45 2 24 2 91% 34, 74->97, 101
nvtabular/io/parquet.py 569 31 188 28 92% 34-35, 58, 80->103, 98, 123, 152-153, 170->195, 181->195, 232-240, 260, 266, 284->286, 300, 318->328, 321, 370->382, 374, 496-501, 539-544, 660->667, 728->733, 734-735, 855, 859, 863, 869, 901, 918, 922, 929->931, 1039->exit, 1049->1054, 1059->1069, 1074, 1096, 1123
nvtabular/io/shuffle.py 31 7 16 4 72% 42, 44-45, 49, 62-64
nvtabular/io/writer.py 185 13 74 5 92% 25-26, 52, 80, 126, 129, 213, 222, 225, 268, 300-302
nvtabular/io/writer_factory.py 18 2 8 2 85% 35, 60
nvtabular/loader/__init__.py 0 0 0 0 100%
nvtabular/loader/backend.py 331 13 140 11 95% 128, 143-144, 244->246, 256-260, 306-307, 346->350, 347->346, 421, 425-426, 456, 561, 569
nvtabular/loader/tensorflow.py 163 22 52 7 86% 58, 66-69, 84, 98, 314, 350, 365-367, 396-398, 408-416, 419-422
nvtabular/loader/tf_utils.py 55 10 20 5 80% 29->32, 32->34, 39->41, 43, 50-51, 58-60, 66-70
nvtabular/loader/torch.py 81 13 16 2 78% 25-27, 30-36, 117, 155-156
nvtabular/ops/__init__.py 22 0 0 0 100%
nvtabular/ops/add_metadata.py 9 0 0 0 100%
nvtabular/ops/bucketize.py 37 10 18 3 69% 53-55, 59->exit, 62-65, 84-87, 94
nvtabular/ops/categorify.py 622 66 332 47 86% 244, 246, 263, 267, 275, 283, 285, 312, 331-332, 359, 370->374, 378-385, 467-468, 493-494, 611, 704, 722, 758, 836-837, 852-856, 857->821, 875, 883, 890->exit, 914, 917->920, 972, 977, 999->1003, 1005->962, 1011-1014, 1026, 1030, 1032, 1039, 1044-1047, 1125, 1127, 1197->1220, 1203->1220, 1221-1226, 1263, 1282->1287, 1286, 1296->1293, 1301->1293, 1308, 1311, 1319-1329
nvtabular/ops/clip.py 18 2 6 3 79% 44, 52->54, 55
nvtabular/ops/column_similarity.py 118 25 38 5 74% 19-20, 78->exit, 108, 134, 198-199, 208-210, 218-234, 251->254, 255, 265
nvtabular/ops/data_stats.py 56 2 22 3 94% 91->93, 95, 97->87, 102
nvtabular/ops/difference_lag.py 31 1 8 1 95% 69->71, 94
nvtabular/ops/dropna.py 8 0 0 0 100%
nvtabular/ops/fill.py 91 12 36 3 82% 63-67, 93, 121, 147, 151, 162-165
nvtabular/ops/filter.py 20 1 6 1 92% 49
nvtabular/ops/groupby.py 119 3 70 4 96% 73, 84, 94->96, 106->111, 141
nvtabular/ops/hash_bucket.py 41 2 20 2 93% 72, 106->112, 118
nvtabular/ops/hashed_cross.py 36 4 15 3 86% 53, 66, 81, 91
nvtabular/ops/internal/__init__.py 3 0 0 0 100%
nvtabular/ops/internal/concat_columns.py 11 0 0 0 100%
nvtabular/ops/internal/identity.py 6 1 0 0 83% 42
nvtabular/ops/internal/subset_columns.py 13 1 0 0 92% 53
nvtabular/ops/join_external.py 92 18 36 7 76% 20-21, 114, 116, 118, 135-161, 177->179, 216->227, 221
nvtabular/ops/join_groupby.py 101 7 36 4 92% 108, 115, 124, 131->130, 215-216, 219-220
nvtabular/ops/lambdaop.py 39 6 18 6 79% 59, 63, 77, 89, 94, 103
nvtabular/ops/list_slice.py 66 24 26 1 58% 21-22, 53-54, 104-118, 126-137
nvtabular/ops/logop.py 13 0 0 0 100%
nvtabular/ops/moments.py 65 0 20 0 100%
nvtabular/ops/normalize.py 81 10 14 1 86% 70, 78-79, 85, 118-119, 141-142, 146, 157
nvtabular/ops/operator.py 66 1 14 1 98% 111
nvtabular/ops/rename.py 41 3 22 3 90% 47, 88-90
nvtabular/ops/stat_operator.py 8 0 0 0 100%
nvtabular/ops/target_encoding.py 153 11 66 4 91% 167->171, 175->184, 232-233, 236-237, 249-255, 346->349, 362
nvtabular/tags.py 16 0 0 0 100%
nvtabular/tools/__init__.py 0 0 0 0 100%
nvtabular/tools/data_gen.py 236 1 62 1 99% 321
nvtabular/tools/dataset_inspector.py 50 7 18 1 79% 32-39
nvtabular/tools/inspector_script.py 46 46 0 0 0% 17-168
nvtabular/utils.py 106 43 48 8 54% 31-32, 36-37, 50, 61-62, 64-66, 69, 72, 78, 84, 90-126, 145, 149->153
nvtabular/worker.py 82 5 38 7 90% 24-25, 82->99, 91, 92->99, 99->102, 108, 110, 111->113
nvtabular/workflow/__init__.py 2 0 0 0 100%
nvtabular/workflow/node.py 240 18 116 10 89% 55, 93->98, 146, 248->252, 288, 302, 311, 329-334, 339, 388-389, 400->395, 453-458
nvtabular/workflow/workflow.py 221 15 112 7 93% 28-29, 47, 139, 195, 222-224, 332, 347-348, 366-367, 503, 515

TOTAL 7775 1533 3133 347 78%
Coverage XML written to file coverage.xml

Required test coverage of 70% reached. Total coverage: 77.57%
=========================== short test summary info ============================
SKIPPED [1] ../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/io/tests/test_s3.py:16: could not import 's3fs': No module named 's3fs'
SKIPPED [8] tests/unit/test_io.py:581: could not import 'uavro': No module named 'uavro'
SKIPPED [1] tests/unit/loader/test_tf_dataloader.py:521: not working correctly in ci environment
==== 1 failed, 1533 passed, 10 skipped, 697 warnings in 2195.15s (0:36:35) =====
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://github.com/gitapi/repos/NVIDIA/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins2499993751306157999.sh

@nvidia-merlin-bot
Contributor

CI Results
GitHub pull request #1147 of commit 03895e2fdab2c65decdb1e182f517e16a05b91ee, no merge conflicts.
Running as SYSTEM
Setting status of 03895e2fdab2c65decdb1e182f517e16a05b91ee to PENDING with url http://10.20.13.93:8080/job/nvtabular_tests/3553/ and message: 'Pending'
Using context: Jenkins Unit Test Run
Building in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA/NVTabular.git
 > git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA/NVTabular.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA/NVTabular.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA/NVTabular.git +refs/pull/1147/*:refs/remotes/origin/pr/1147/* # timeout=10
 > git rev-parse 03895e2fdab2c65decdb1e182f517e16a05b91ee^{commit} # timeout=10
Checking out Revision 03895e2fdab2c65decdb1e182f517e16a05b91ee (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 03895e2fdab2c65decdb1e182f517e16a05b91ee # timeout=10
Commit message: "Merge branch 'main' into epochs-arg"
 > git rev-list --no-walk 560a44ff0c2f57387de13adff16ad3055cd83a5a # timeout=10
[nvtabular_tests] $ /bin/bash /tmp/jenkins4999703901018820716.sh
Installing NVTabular
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Requirement already satisfied: pip in /var/jenkins_home/.local/lib/python3.8/site-packages (21.2.4)
Requirement already satisfied: setuptools in /var/jenkins_home/.local/lib/python3.8/site-packages (58.1.0)
Requirement already satisfied: wheel in /var/jenkins_home/.local/lib/python3.8/site-packages (0.37.0)
Requirement already satisfied: pybind11 in /var/jenkins_home/.local/lib/python3.8/site-packages (2.7.1)
running develop
running egg_info
creating nvtabular.egg-info
writing nvtabular.egg-info/PKG-INFO
writing dependency_links to nvtabular.egg-info/dependency_links.txt
writing requirements to nvtabular.egg-info/requires.txt
writing top-level names to nvtabular.egg-info/top_level.txt
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching '*.h' under directory 'cpp'
warning: no files found matching '*.cu' under directory 'cpp'
warning: no files found matching '*.cuh' under directory 'cpp'
adding license file 'LICENSE'
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
running build_ext
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.8 -c flagcheck.cpp -o flagcheck.o -std=c++17
building 'nvtabular_cpp' extension
creating build
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/cpp
creating build/temp.linux-x86_64-3.8/cpp/nvtabular
creating build/temp.linux-x86_64-3.8/cpp/nvtabular/inference
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+4.g03895e2 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+4.g03895e2 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+4.g03895e2 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/categorify.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+4.g03895e2 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/fill.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -std=c++17 -fvisibility=hidden -g0
creating build/lib.linux-x86_64-3.8
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -o build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so
copying build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so -> 
Generating nvtabular/inference/triton/model_config_pb2.py from nvtabular/inference/triton/model_config.proto
Creating /var/jenkins_home/.local/lib/python3.8/site-packages/nvtabular.egg-link (link to .)
nvtabular 0.7.0+4.g03895e2 is already the active version in easy-install.pth

Installed /var/jenkins_home/workspace/nvtabular_tests/nvtabular
Processing dependencies for nvtabular==0.7.0+4.g03895e2
Searching for protobuf==3.18.0
Best match: protobuf 3.18.0
Adding protobuf 3.18.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for tensorflow-metadata==1.2.0
Best match: tensorflow-metadata 1.2.0
Processing tensorflow_metadata-1.2.0-py3.8.egg
tensorflow-metadata 1.2.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tensorflow_metadata-1.2.0-py3.8.egg
Searching for pyarrow==4.0.1
Best match: pyarrow 4.0.1
Adding pyarrow 4.0.1 to easy-install.pth file
Installing plasma_store script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for tqdm==4.61.2
Best match: tqdm 4.61.2
Processing tqdm-4.61.2-py3.8.egg
tqdm 4.61.2 is already the active version in easy-install.pth
Installing tqdm script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tqdm-4.61.2-py3.8.egg
Searching for numba==0.54.0
Best match: numba 0.54.0
Processing numba-0.54.0-py3.8-linux-x86_64.egg
numba 0.54.0 is already the active version in easy-install.pth
Installing pycc script to /var/jenkins_home/.local/bin
Installing numba script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/numba-0.54.0-py3.8-linux-x86_64.egg
Searching for pandas==1.3.3
Best match: pandas 1.3.3
Processing pandas-1.3.3-py3.8-linux-x86_64.egg
pandas 1.3.3 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg
Searching for distributed==2021.7.1
Best match: distributed 2021.7.1
Processing distributed-2021.7.1-py3.8.egg
distributed 2021.7.1 is already the active version in easy-install.pth
Installing dask-ssh script to /var/jenkins_home/.local/bin
Installing dask-scheduler script to /var/jenkins_home/.local/bin
Installing dask-worker script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/distributed-2021.7.1-py3.8.egg
Searching for dask==2021.7.1
Best match: dask 2021.7.1
Adding dask 2021.7.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for PyYAML==5.4.1
Best match: PyYAML 5.4.1
Processing PyYAML-5.4.1-py3.8-linux-x86_64.egg
PyYAML 5.4.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg
Searching for googleapis-common-protos==1.53.0
Best match: googleapis-common-protos 1.53.0
Processing googleapis_common_protos-1.53.0-py3.8.egg
googleapis-common-protos 1.53.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/googleapis_common_protos-1.53.0-py3.8.egg
Searching for absl-py==0.12.0
Best match: absl-py 0.12.0
Processing absl_py-0.12.0-py3.8.egg
absl-py 0.12.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/absl_py-0.12.0-py3.8.egg
Searching for numpy==1.20.2
Best match: numpy 1.20.2
Adding numpy 1.20.2 to easy-install.pth file
Installing f2py script to /var/jenkins_home/.local/bin
Installing f2py3 script to /var/jenkins_home/.local/bin
Installing f2py3.8 script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for setuptools==58.1.0
Best match: setuptools 58.1.0
Adding setuptools 58.1.0 to easy-install.pth file

Using /var/jenkins_home/.local/lib/python3.8/site-packages
Searching for llvmlite==0.37.0
Best match: llvmlite 0.37.0
Processing llvmlite-0.37.0-py3.8-linux-x86_64.egg
llvmlite 0.37.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/llvmlite-0.37.0-py3.8-linux-x86_64.egg
Searching for pytz==2021.1
Best match: pytz 2021.1
Adding pytz 2021.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for python-dateutil==2.8.2
Best match: python-dateutil 2.8.2
Adding python-dateutil 2.8.2 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for zict==2.0.0
Best match: zict 2.0.0
Processing zict-2.0.0-py3.8.egg
zict 2.0.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg
Searching for tornado==6.1
Best match: tornado 6.1
Processing tornado-6.1-py3.8-linux-x86_64.egg
tornado 6.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg
Searching for toolz==0.11.1
Best match: toolz 0.11.1
Processing toolz-0.11.1-py3.8.egg
toolz 0.11.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/toolz-0.11.1-py3.8.egg
Searching for tblib==1.7.0
Best match: tblib 1.7.0
Processing tblib-1.7.0-py3.8.egg
tblib 1.7.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg
Searching for sortedcontainers==2.4.0
Best match: sortedcontainers 2.4.0
Processing sortedcontainers-2.4.0-py3.8.egg
sortedcontainers 2.4.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg
Searching for psutil==5.8.0
Best match: psutil 5.8.0
Processing psutil-5.8.0-py3.8-linux-x86_64.egg
psutil 5.8.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg
Searching for msgpack==1.0.2
Best match: msgpack 1.0.2
Processing msgpack-1.0.2-py3.8-linux-x86_64.egg
msgpack 1.0.2 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/msgpack-1.0.2-py3.8-linux-x86_64.egg
Searching for cloudpickle==1.6.0
Best match: cloudpickle 1.6.0
Processing cloudpickle-1.6.0-py3.8.egg
cloudpickle 1.6.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/cloudpickle-1.6.0-py3.8.egg
Searching for click==8.0.1
Best match: click 8.0.1
Processing click-8.0.1-py3.8.egg
click 8.0.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/click-8.0.1-py3.8.egg
Searching for partd==1.2.0
Best match: partd 1.2.0
Processing partd-1.2.0-py3.8.egg
partd 1.2.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg
Searching for packaging==21.0
Best match: packaging 21.0
Adding packaging 21.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for fsspec==2021.8.1
Best match: fsspec 2021.8.1
Processing fsspec-2021.8.1-py3.8.egg
fsspec 2021.8.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/fsspec-2021.8.1-py3.8.egg
Searching for six==1.15.0
Best match: six 1.15.0
Adding six 1.15.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for HeapDict==1.0.1
Best match: HeapDict 1.0.1
Processing HeapDict-1.0.1-py3.8.egg
HeapDict 1.0.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg
Searching for locket==0.2.1
Best match: locket 0.2.1
Processing locket-0.2.1-py3.8.egg
locket 0.2.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg
Searching for pyparsing==2.4.7
Best match: pyparsing 2.4.7
Adding pyparsing 2.4.7 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Finished processing dependencies for nvtabular==0.7.0+4.g03895e2
Running black --check
All done! ✨ 🍰 ✨
130 files would be left unchanged.
Running flake8
Running isort
Skipped 2 files
Running bandit
Running pylint
************* Module nvtabular.ops.categorify
nvtabular/ops/categorify.py:497:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)
************* Module nvtabular.ops.fill
nvtabular/ops/fill.py:67:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)


Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)
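
The two pylint notes above are its standard suggestion for compiled extensions: allow-list the nvtabular_cpp module so its members can be introspected at run time instead of being flagged. A minimal sketch of that suggestion, assuming pylint's programmatic Run entry point and its --extension-pkg-allow-list option (the lint target shown is illustrative):

from pylint.lint import Run

# Re-lint the package with the compiled extension allow-listed so the
# c-extension-no-member notes above are not emitted.
Run(["--extension-pkg-allow-list=nvtabular_cpp", "nvtabular"])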

Running flake8-nb
Building docs
make: Entering directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.7) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
make: Leaving directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-6.2.5, py-1.10.0, pluggy-1.0.0
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: cov-2.12.1, forked-1.3.0, xdist-2.4.0
collected 1543 items / 1 skipped / 1542 selected

tests/unit/test_dask_nvt.py ............................................ [ 2%]
....................................................................... [ 7%]
tests/unit/test_io.py .................................................. [ 10%]
........................................................................ [ 15%]
.................ssssssss............................................... [ 20%]
........ [ 20%]
tests/unit/test_notebooks.py ...... [ 20%]
tests/unit/test_tf4rec.py . [ 20%]
tests/unit/test_tools.py ...................... [ 22%]
tests/unit/test_triton_inference.py .............................. [ 24%]
tests/unit/columns/test_column_schemas.py .............................. [ 26%]
.................................................... [ 29%]
tests/unit/columns/test_column_selector.py .................... [ 30%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 31%]
tests/unit/framework_utils/test_tf_layers.py ..F........................ [ 32%]
................................................... [ 36%]
tests/unit/framework_utils/test_torch_layers.py . [ 36%]
tests/unit/loader/test_dataloader_backend.py .... [ 36%]
tests/unit/loader/test_tf_dataloader.py ................................ [ 38%]
........................................s.. [ 41%]
tests/unit/loader/test_torch_dataloader.py ............................. [ 43%]
........................................................ [ 46%]
tests/unit/ops/test_column_similarity.py ........................ [ 48%]
tests/unit/ops/test_ops.py ............................................. [ 51%]
........................................................................ [ 55%]
........................................................................ [ 60%]
........................................................................ [ 65%]
........................................................................ [ 69%]
........................................................................ [ 74%]
................................................. [ 77%]
tests/unit/ops/test_ops_schema.py ...................................... [ 80%]
........................................................................ [ 84%]
........................................................................ [ 89%]
.......................... [ 91%]
tests/unit/workflow/test_cpu_workflow.py ...... [ 91%]
tests/unit/workflow/test_workflow.py ................................... [ 93%]
.......................................................... [ 97%]
tests/unit/workflow/test_workflow_node.py ........... [ 98%]
tests/unit/workflow/test_workflow_ops.py .. [ 98%]
tests/unit/workflow/test_workflow_schemas.py ....................... [100%]

=================================== FAILURES ===================================
____________________ test_dense_embedding_layer[mean-stack] ____________________

aggregation = 'stack', combiner = 'mean'

@pytest.mark.parametrize("aggregation", ["stack", "concat"])
@pytest.mark.parametrize("combiner", ["sum", "mean"])  # TODO: add sqrtn
def test_dense_embedding_layer(aggregation, combiner):
    raw_good_columns = get_good_feature_columns()
    scalar_numeric, vector_numeric, one_hot, multi_hot = raw_good_columns
    one_hot_embedding = tf.feature_column.indicator_column(one_hot)
    multi_hot_embedding = tf.feature_column.embedding_column(multi_hot, 8, combiner=combiner)

    # should raise ValueError if passed categorical columns
    with pytest.raises(ValueError):
        embedding_layer = layers.DenseFeatures(raw_good_columns, aggregation=aggregation)

    if aggregation == "stack":
        # can't pass numeric to stack aggregation unless dims are 1
        with pytest.raises(ValueError):
            embedding_layer = layers.DenseFeatures(
                [
                    scalar_numeric,
                    vector_numeric,
                    one_hot_embedding,
                    multi_hot_embedding,
                ],
                aggregation=aggregation,
            )
        # can't have mismatched dims with stack aggregation
        with pytest.raises(ValueError):
            embedding_layer = layers.DenseFeatures(
                [one_hot_embedding, multi_hot_embedding], aggregation=aggregation
            )

        # reset b embedding to have matching dims
        multi_hot_embedding = tf.feature_column.embedding_column(multi_hot, 100, combiner=combiner)
        cols = [one_hot_embedding, multi_hot_embedding]
    else:
        cols = [scalar_numeric, vector_numeric, one_hot_embedding, multi_hot_embedding]

    embedding_layer = layers.DenseFeatures(cols, aggregation=aggregation)
    inputs = {
        "scalar_continuous": tf.keras.Input(name="scalar_continuous", shape=(1,), dtype=tf.float32),
        "vector_continuous": tf.keras.Input(
            name="vector_continuous__values", shape=(1,), dtype=tf.float32
        ),
        "one_hot": tf.keras.Input(name="one_hot", shape=(1,), dtype=tf.int64),
        "multi_hot": (
            tf.keras.Input(name="multi_hot__values", shape=(1,), dtype=tf.int64),
            tf.keras.Input(name="multi_hot__nnzs", shape=(1,), dtype=tf.int64),
        ),
    }
    if aggregation == "stack":
        inputs.pop("scalar_continuous")
        inputs.pop("vector_continuous")

    output = embedding_layer(inputs)
    model = tf.keras.Model(inputs=inputs, outputs=output)
    model.compile("sgd", "mse")

    # TODO: check for out-of-range categorical behavior
    scalar = np.array([0.1, -0.2, 0.3], dtype=np.float32)
    vector = np.random.randn(3, 128).astype("float32")
    one_hot = np.array([44, 21, 32])
    multi_hot_values = np.array([0, 2, 1, 4, 1, 3, 1])
    multi_hot_nnzs = np.array([1, 2, 4])
    x = {
        "scalar_continuous": scalar[:, None],
        "vector_continuous": vector.flatten()[:, None],
        "one_hot": one_hot[:, None],
        "multi_hot": (multi_hot_values[:, None], multi_hot_nnzs[:, None]),
    }
    if aggregation == "stack":
        x.pop("scalar_continuous")
        x.pop("vector_continuous")

    multi_hot_embedding_table = embedding_layer.embedding_tables["multi_hot"].numpy()
    multi_hot_embedding_rows = _compute_expected_multi_hot(
        multi_hot_embedding_table, multi_hot_values, multi_hot_nnzs, combiner
    )

    # check that shape and values match up
    y_hat = model(x).numpy()
    assert y_hat.shape[0] == 3
    if aggregation == "stack":
        assert len(y_hat.shape) == 3
        # len of columns is 2 because of mh (vals, nnzs) struct
        assert y_hat.shape[1] == (len(x))
        assert y_hat.shape[2] == 100
      np.testing.assert_allclose(y_hat[:, 0], multi_hot_embedding_rows, rtol=1e-05)

E AssertionError:
E Not equal to tolerance rtol=1e-05, atol=0
E
E Mismatched elements: 1 / 300 (0.333%)
E Max absolute difference: 1.4901161e-08
E Max relative difference: 1.0973215e-05
E x: array([[ 8.713624e-02, -5.853207e-02, -5.087129e-02, -1.146480e-02,
E 1.569657e-01, 1.378342e-01, -2.778266e-02, -5.284876e-02,
E 2.679424e-01, 4.617411e-02, -9.085266e-02, -1.140780e-01,...
E y: array([[ 8.713624e-02, -5.853207e-02, -5.087129e-02, -1.146480e-02,
E 1.569657e-01, 1.378342e-01, -2.778266e-02, -5.284876e-02,
E 2.679424e-01, 4.617411e-02, -9.085266e-02, -1.140780e-01,...

tests/unit/framework_utils/test_tf_layers.py:139: AssertionError
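
The assertion above fails on a float32 rounding difference rather than a logic error: the reported maximum absolute difference is about 1.5e-08, and at an element magnitude of roughly 1.4e-03 that is a relative error of ~1.097e-05, just over the rtol=1e-05 the test allows. A minimal sketch of that arithmetic with illustrative numbers (not the actual embedding values):

import numpy as np

expected = np.float32(1.358e-03)          # illustrative magnitude only
actual = expected + np.float32(1.49e-08)  # absolute error reported in the log

print(abs(actual - expected) / abs(expected))  # ~1.1e-05, just above rtol=1e-05

# np.testing.assert_allclose(actual, expected, rtol=1e-05)   # would fail as above
np.testing.assert_allclose(actual, expected, rtol=1e-05, atol=1e-07)  # passes with a small atol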
=============================== warnings summary ===============================
tests/unit/test_dask_nvt.py: 3 warnings
tests/unit/test_io.py: 24 warnings
tests/unit/test_tf4rec.py: 1 warning
tests/unit/test_tools.py: 2 warnings
tests/unit/test_triton_inference.py: 5 warnings
tests/unit/loader/test_tf_dataloader.py: 50 warnings
tests/unit/loader/test_torch_dataloader.py: 16 warnings
tests/unit/ops/test_column_similarity.py: 7 warnings
tests/unit/ops/test_ops.py: 74 warnings
tests/unit/workflow/test_workflow.py: 30 warnings
tests/unit/workflow/test_workflow_node.py: 1 warning
tests/unit/workflow/test_workflow_schemas.py: 1 warning
/var/jenkins_home/.local/lib/python3.8/site-packages/numba-0.54.0-py3.8-linux-x86_64.egg/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (1) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))

tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 36 warnings
tests/unit/workflow/test_workflow.py: 44 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/workflow.py:89: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for execution. Please use the client argument to initialize a Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 52 warnings
tests/unit/workflow/test_workflow.py: 35 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dask.py:375: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for this write operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/test_io.py: 96 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/__init__.py:38: DeprecationWarning: ColumnGroup is deprecated, use ColumnSelector instead
warnings.warn("ColumnGroup is deprecated, use ColumnSelector instead", DeprecationWarning)

tests/unit/test_io.py: 24 warnings
tests/unit/loader/test_torch_dataloader.py: 54 warnings
tests/unit/workflow/test_workflow_node.py: 1 warning
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/node.py:47: FutureWarning: The ["a", "b", "c"] >> ops.Operator syntax for creating a ColumnGroup has been deprecated in NVTabular 21.09 and will be removed in a future version.
warnings.warn(

tests/unit/test_io.py: 20 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:516: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler is being used for this shuffle operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(
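
The three scheduler warnings above all point at the same remedy: pass a dask.distributed client when constructing the Dataset and Workflow so transforms, writes, and shuffles run distributed instead of on the single-threaded fallback. A minimal sketch under that assumption (the parquet path, column names, and op choice are placeholders; the list >> op form still emits the FutureWarning noted above):

from dask.distributed import Client

import nvtabular as nvt

client = Client()  # local distributed scheduler

features = ["x", "y"] >> nvt.ops.Normalize()  # emits the FutureWarning noted above
dataset = nvt.Dataset("data/*.parquet", engine="parquet", client=client)
workflow = nvt.Workflow(features, client=client)
workflow.fit(dataset)  # runs on the client instead of the single-threaded fallback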

tests/unit/ops/test_ops.py::test_fill_missing[True-True-parquet]
tests/unit/ops/test_ops.py::test_fill_missing[True-False-parquet]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg/pandas/core/indexing.py:1732: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self._setitem_single_block(indexer, value, name)

tests/unit/ops/test_ops.py: 80 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/join_external.py:191: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[tmp] = _arange(len(df), like_df=df, dtype="int32")
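
The SettingWithCopyWarning entries here and below come from assigning into a frame that pandas cannot be sure is a copy rather than a view; the recommended patterns are an explicit .copy() or assignment through .loc on the original frame. A minimal standalone sketch of both (the frame and column names are placeholders, unrelated to the NVTabular code paths named in the warnings):

import pandas as pd

df = pd.DataFrame({"a": [1, 2, 3], "b": [4, 5, 6]})

# subset = df[df["a"] > 1]; subset["b"] = 0   # chained assignment -> SettingWithCopyWarning
subset = df[df["a"] > 1].copy()               # explicit copy removes the ambiguity
subset["b"] = 0

df.loc[df["a"] > 1, "b"] = 0                  # or assign through .loc on the original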

tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_groupby_op[id-True]
tests/unit/ops/test_ops.py::test_groupby_op[id-False]
/usr/local/lib/python3.8/dist-packages/dask/dataframe/core.py:6778: UserWarning: Insufficient elements for head. 1 elements requested, only 0 elements available. Try passing larger npartitions to head.
warnings.warn(msg.format(n, len(r)))

tests/unit/workflow/test_cpu_workflow.py: 14 warnings
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg/pandas/core/frame.py:3641: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self[k1] = value[k2]

-- Docs: https://docs.pytest.org/en/stable/warnings.html

---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name Stmts Miss Branch BrPart Cover Missing

examples/multi-gpu-movielens/torch_trainer.py 65 0 6 1 99% 32->36
examples/multi-gpu-movielens/torch_trainer_dist.py 63 0 2 0 100%
nvtabular/__init__.py 18 0 0 0 100%
nvtabular/columns/__init__.py 2 0 0 0 100%
nvtabular/columns/schema.py 217 17 107 20 89% 46->62, 49, 51, 53-56, 58, 98->114, 106, 110, 156, 183, 269->276, 272->274, 283, 300->305, 303->305, 316, 340, 347, 356, 359, 364->363
nvtabular/columns/selector.py 74 1 34 0 99% 121
nvtabular/dispatch.py 290 55 144 23 79% 36-40, 45-47, 53-63, 70-71, 114-116, 121-124, 128-133, 140, 159, 170, 176, 181->183, 194, 217-220, 259->261, 268, 271, 277, 293, 300, 331->336, 334, 337, 340->344, 377, 388-391, 417-420, 450, 454, 495, 519, 521, 528
nvtabular/framework_utils/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/tensorflow/__init__.py 1 0 0 0 100%
nvtabular/framework_utils/tensorflow/feature_column_utils.py 134 78 90 15 39% 30, 99, 103, 114-130, 140, 143-158, 162, 166-167, 173-198, 207-217, 220-227, 229->233, 234, 239-279, 282
nvtabular/framework_utils/tensorflow/layers/__init__.py 4 0 0 0 100%
nvtabular/framework_utils/tensorflow/layers/embedding.py 153 12 85 6 91% 60, 68->49, 122, 179, 231-239, 335->343, 357->360, 363-364, 367
nvtabular/framework_utils/tensorflow/layers/interaction.py 47 25 20 1 43% 49, 74-103, 106-110, 113
nvtabular/framework_utils/tensorflow/layers/outer_product.py 30 24 10 0 15% 37-38, 41-60, 71-84, 87
nvtabular/framework_utils/tensorflow/tfrecords_to_parquet.py 58 58 30 0 0% 16-111
nvtabular/framework_utils/torch/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/torch/layers/__init__.py 2 0 0 0 100%
nvtabular/framework_utils/torch/layers/embeddings.py 32 2 14 2 91% 50, 91
nvtabular/framework_utils/torch/models.py 45 1 28 4 93% 57->61, 87->89, 93->96, 103
nvtabular/framework_utils/torch/utils.py 75 5 30 5 90% 51->53, 64, 71->76, 75, 118-120
nvtabular/inference/__init__.py 0 0 0 0 100%
nvtabular/inference/triton/__init__.py 385 210 180 13 45% 82-86, 141-174, 195-218, 263-307, 338, 364-372, 380-387, 406, 428-444, 485-489, 527-537, 583-623, 629-645, 649-716, 723->726, 726->722, 762-772, 781, 791, 812, 818-844, 850-876, 883, 889->892, 893
nvtabular/inference/triton/benchmarking_tools.py 52 52 10 0 0% 2-103
nvtabular/inference/triton/data_conversions.py 87 3 58 4 95% 32-33, 84
nvtabular/inference/triton/model.py 176 176 98 0 0% 27-332
nvtabular/inference/triton/model_config_pb2.py 299 0 2 0 100%
nvtabular/inference/triton/model_pt.py 101 101 40 0 0% 27-220
nvtabular/io/__init__.py 4 0 0 0 100%
nvtabular/io/avro.py 88 88 30 0 0% 16-189
nvtabular/io/csv.py 57 6 20 5 86% 22-23, 99, 103->107, 108, 110, 124
nvtabular/io/dask.py 183 8 72 11 93% 111, 114, 150, 401, 411, 428->431, 439, 443->445, 445->441, 450, 452
nvtabular/io/dataframe_engine.py 61 5 28 6 88% 19-20, 50, 69, 88->92, 92->97, 94->97, 97->116, 125
nvtabular/io/dataset.py 361 44 172 29 85% 47-48, 264, 266, 279, 304-318, 441->515, 446-449, 454->464, 471->469, 472->476, 489->493, 504, 515->524, 575-576, 577->581, 629, 752, 754, 756, 762, 766-768, 770, 830-831, 858, 865-866, 872, 878, 975-976, 1094-1099, 1105, 1117-1118, 1206
nvtabular/io/dataset_engine.py 31 1 4 0 97% 48
nvtabular/io/fsspec_utils.py 117 103 64 0 8% 26-27, 42-98, 103-135, 151-191, 213-263, 268-284, 288-290, 304-315
nvtabular/io/hugectr.py 45 2 24 2 91% 34, 74->97, 101
nvtabular/io/parquet.py 569 31 188 28 92% 34-35, 58, 80->103, 98, 123, 152-153, 170->195, 181->195, 232-240, 260, 266, 284->286, 300, 318->328, 321, 370->382, 374, 496-501, 539-544, 660->667, 728->733, 734-735, 855, 859, 863, 869, 901, 918, 922, 929->931, 1039->exit, 1049->1054, 1059->1069, 1074, 1096, 1123
nvtabular/io/shuffle.py 31 7 16 4 72% 42, 44-45, 49, 62-64
nvtabular/io/writer.py 185 13 74 5 92% 25-26, 52, 80, 126, 129, 213, 222, 225, 268, 300-302
nvtabular/io/writer_factory.py 18 2 8 2 85% 35, 60
nvtabular/loader/__init__.py 0 0 0 0 100%
nvtabular/loader/backend.py 331 13 140 11 95% 128, 143-144, 244->246, 256-260, 306-307, 346->350, 347->346, 421, 425-426, 456, 561, 569
nvtabular/loader/tensorflow.py 163 22 52 7 86% 58, 66-69, 84, 98, 314, 350, 365-367, 396-398, 408-416, 419-422
nvtabular/loader/tf_utils.py 55 10 20 5 80% 29->32, 32->34, 39->41, 43, 50-51, 58-60, 66-70
nvtabular/loader/torch.py 81 13 16 2 78% 25-27, 30-36, 117, 155-156
nvtabular/ops/__init__.py 22 0 0 0 100%
nvtabular/ops/add_metadata.py 9 0 0 0 100%
nvtabular/ops/bucketize.py 37 10 18 3 69% 53-55, 59->exit, 62-65, 84-87, 94
nvtabular/ops/categorify.py 622 66 332 47 86% 244, 246, 263, 267, 275, 283, 285, 312, 331-332, 359, 370->374, 378-385, 467-468, 493-494, 611, 704, 722, 758, 836-837, 852-856, 857->821, 875, 883, 890->exit, 914, 917->920, 972, 977, 999->1003, 1005->962, 1011-1014, 1026, 1030, 1032, 1039, 1044-1047, 1125, 1127, 1197->1220, 1203->1220, 1221-1226, 1263, 1282->1287, 1286, 1296->1293, 1301->1293, 1308, 1311, 1319-1329
nvtabular/ops/clip.py 18 2 6 3 79% 44, 52->54, 55
nvtabular/ops/column_similarity.py 118 25 38 5 74% 19-20, 78->exit, 108, 134, 198-199, 208-210, 218-234, 251->254, 255, 265
nvtabular/ops/data_stats.py 56 2 22 3 94% 91->93, 95, 97->87, 102
nvtabular/ops/difference_lag.py 31 1 8 1 95% 69->71, 94
nvtabular/ops/dropna.py 8 0 0 0 100%
nvtabular/ops/fill.py 91 12 36 3 82% 63-67, 93, 121, 147, 151, 162-165
nvtabular/ops/filter.py 20 1 6 1 92% 49
nvtabular/ops/groupby.py 119 3 70 4 96% 73, 84, 94->96, 106->111, 141
nvtabular/ops/hash_bucket.py 41 2 20 2 93% 72, 106->112, 118
nvtabular/ops/hashed_cross.py 36 4 15 3 86% 53, 66, 81, 91
nvtabular/ops/internal/__init__.py 3 0 0 0 100%
nvtabular/ops/internal/concat_columns.py 11 0 0 0 100%
nvtabular/ops/internal/identity.py 6 1 0 0 83% 42
nvtabular/ops/internal/subset_columns.py 13 1 0 0 92% 53
nvtabular/ops/join_external.py 92 18 36 7 76% 20-21, 114, 116, 118, 135-161, 177->179, 216->227, 221
nvtabular/ops/join_groupby.py 101 7 36 4 92% 108, 115, 124, 131->130, 215-216, 219-220
nvtabular/ops/lambdaop.py 39 6 18 6 79% 59, 63, 77, 89, 94, 103
nvtabular/ops/list_slice.py 66 24 26 1 58% 21-22, 53-54, 104-118, 126-137
nvtabular/ops/logop.py 13 0 0 0 100%
nvtabular/ops/moments.py 65 0 20 0 100%
nvtabular/ops/normalize.py 81 10 14 1 86% 70, 78-79, 85, 118-119, 141-142, 146, 157
nvtabular/ops/operator.py 66 1 14 1 98% 111
nvtabular/ops/rename.py 41 3 22 3 90% 47, 88-90
nvtabular/ops/stat_operator.py 8 0 0 0 100%
nvtabular/ops/target_encoding.py 153 11 66 4 91% 167->171, 175->184, 232-233, 236-237, 249-255, 346->349, 362
nvtabular/tags.py 16 0 0 0 100%
nvtabular/tools/__init__.py 0 0 0 0 100%
nvtabular/tools/data_gen.py 236 1 62 1 99% 321
nvtabular/tools/dataset_inspector.py 50 7 18 1 79% 32-39
nvtabular/tools/inspector_script.py 46 46 0 0 0% 17-168
nvtabular/utils.py 106 43 48 8 54% 31-32, 36-37, 50, 61-62, 64-66, 69, 72, 78, 84, 90-126, 145, 149->153
nvtabular/worker.py 82 5 38 7 90% 24-25, 82->99, 91, 92->99, 99->102, 108, 110, 111->113
nvtabular/workflow/__init__.py 2 0 0 0 100%
nvtabular/workflow/node.py 240 18 116 10 89% 55, 93->98, 146, 248->252, 288, 302, 311, 329-334, 339, 388-389, 400->395, 453-458
nvtabular/workflow/workflow.py 221 15 112 7 93% 28-29, 47, 139, 195, 222-224, 332, 347-348, 366-367, 503, 515

TOTAL 7775 1533 3133 347 78%
Coverage XML written to file coverage.xml

Required test coverage of 70% reached. Total coverage: 77.57%
=========================== short test summary info ============================
SKIPPED [1] ../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/io/tests/test_s3.py:16: could not import 's3fs': No module named 's3fs'
SKIPPED [8] tests/unit/test_io.py:581: could not import 'uavro': No module named 'uavro'
SKIPPED [1] tests/unit/loader/test_tf_dataloader.py:521: not working correctly in ci environment
==== 1 failed, 1533 passed, 10 skipped, 683 warnings in 2470.60s (0:41:10) =====
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://github.com/gitapi/repos/NVIDIA/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins4321831072147216540.sh

@nvidia-merlin-bot

Click to view CI Results
GitHub pull request #1147 of commit bb29f5d1061a4c2d6857c17b99ed27656fbb7ab4, no merge conflicts.
Running as SYSTEM
Setting status of bb29f5d1061a4c2d6857c17b99ed27656fbb7ab4 to PENDING with url http://10.20.13.93:8080/job/nvtabular_tests/3564/ and message: 'Pending'
Using context: Jenkins Unit Test Run
Building in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA/NVTabular.git
 > git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA/NVTabular.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA/NVTabular.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA/NVTabular.git +refs/pull/1147/*:refs/remotes/origin/pr/1147/* # timeout=10
 > git rev-parse bb29f5d1061a4c2d6857c17b99ed27656fbb7ab4^{commit} # timeout=10
Checking out Revision bb29f5d1061a4c2d6857c17b99ed27656fbb7ab4 (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f bb29f5d1061a4c2d6857c17b99ed27656fbb7ab4 # timeout=10
Commit message: "Merge branch 'main' into epochs-arg"
 > git rev-list --no-walk a319501ab7c7d6380c975a50d1760e0f4d7a7c95 # timeout=10
[nvtabular_tests] $ /bin/bash /tmp/jenkins2181453007388880785.sh
Installing NVTabular
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Requirement already satisfied: pip in /var/jenkins_home/.local/lib/python3.8/site-packages (21.2.4)
Requirement already satisfied: setuptools in /var/jenkins_home/.local/lib/python3.8/site-packages (58.1.0)
Requirement already satisfied: wheel in /var/jenkins_home/.local/lib/python3.8/site-packages (0.37.0)
Requirement already satisfied: pybind11 in /var/jenkins_home/.local/lib/python3.8/site-packages (2.7.1)
running develop
running egg_info
creating nvtabular.egg-info
writing nvtabular.egg-info/PKG-INFO
writing dependency_links to nvtabular.egg-info/dependency_links.txt
writing requirements to nvtabular.egg-info/requires.txt
writing top-level names to nvtabular.egg-info/top_level.txt
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching '*.h' under directory 'cpp'
warning: no files found matching '*.cu' under directory 'cpp'
warning: no files found matching '*.cuh' under directory 'cpp'
adding license file 'LICENSE'
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
running build_ext
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.8 -c flagcheck.cpp -o flagcheck.o -std=c++17
building 'nvtabular_cpp' extension
creating build
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/cpp
creating build/temp.linux-x86_64-3.8/cpp/nvtabular
creating build/temp.linux-x86_64-3.8/cpp/nvtabular/inference
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+6.gbb29f5d -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+6.gbb29f5d -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+6.gbb29f5d -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/categorify.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+6.gbb29f5d -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/fill.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -std=c++17 -fvisibility=hidden -g0
creating build/lib.linux-x86_64-3.8
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -o build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so
copying build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so -> 
Generating nvtabular/inference/triton/model_config_pb2.py from nvtabular/inference/triton/model_config.proto
Creating /var/jenkins_home/.local/lib/python3.8/site-packages/nvtabular.egg-link (link to .)
nvtabular 0.7.0+6.gbb29f5d is already the active version in easy-install.pth

Installed /var/jenkins_home/workspace/nvtabular_tests/nvtabular
Processing dependencies for nvtabular==0.7.0+6.gbb29f5d
Searching for protobuf==3.18.0
Best match: protobuf 3.18.0
Adding protobuf 3.18.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for tensorflow-metadata==1.2.0
Best match: tensorflow-metadata 1.2.0
Processing tensorflow_metadata-1.2.0-py3.8.egg
tensorflow-metadata 1.2.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tensorflow_metadata-1.2.0-py3.8.egg
Searching for pyarrow==4.0.1
Best match: pyarrow 4.0.1
Adding pyarrow 4.0.1 to easy-install.pth file
Installing plasma_store script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for tqdm==4.61.2
Best match: tqdm 4.61.2
Processing tqdm-4.61.2-py3.8.egg
tqdm 4.61.2 is already the active version in easy-install.pth
Installing tqdm script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tqdm-4.61.2-py3.8.egg
Searching for numba==0.54.0
Best match: numba 0.54.0
Processing numba-0.54.0-py3.8-linux-x86_64.egg
numba 0.54.0 is already the active version in easy-install.pth
Installing pycc script to /var/jenkins_home/.local/bin
Installing numba script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/numba-0.54.0-py3.8-linux-x86_64.egg
Searching for pandas==1.3.3
Best match: pandas 1.3.3
Processing pandas-1.3.3-py3.8-linux-x86_64.egg
pandas 1.3.3 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg
Searching for distributed==2021.7.1
Best match: distributed 2021.7.1
Processing distributed-2021.7.1-py3.8.egg
distributed 2021.7.1 is already the active version in easy-install.pth
Installing dask-ssh script to /var/jenkins_home/.local/bin
Installing dask-scheduler script to /var/jenkins_home/.local/bin
Installing dask-worker script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/distributed-2021.7.1-py3.8.egg
Searching for dask==2021.7.1
Best match: dask 2021.7.1
Adding dask 2021.7.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for googleapis-common-protos==1.53.0
Best match: googleapis-common-protos 1.53.0
Processing googleapis_common_protos-1.53.0-py3.8.egg
googleapis-common-protos 1.53.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/googleapis_common_protos-1.53.0-py3.8.egg
Searching for absl-py==0.12.0
Best match: absl-py 0.12.0
Processing absl_py-0.12.0-py3.8.egg
absl-py 0.12.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/absl_py-0.12.0-py3.8.egg
Searching for numpy==1.20.2
Best match: numpy 1.20.2
Adding numpy 1.20.2 to easy-install.pth file
Installing f2py script to /var/jenkins_home/.local/bin
Installing f2py3 script to /var/jenkins_home/.local/bin
Installing f2py3.8 script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for setuptools==58.1.0
Best match: setuptools 58.1.0
Adding setuptools 58.1.0 to easy-install.pth file

Using /var/jenkins_home/.local/lib/python3.8/site-packages
Searching for llvmlite==0.37.0
Best match: llvmlite 0.37.0
Processing llvmlite-0.37.0-py3.8-linux-x86_64.egg
llvmlite 0.37.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/llvmlite-0.37.0-py3.8-linux-x86_64.egg
Searching for pytz==2021.1
Best match: pytz 2021.1
Adding pytz 2021.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for python-dateutil==2.8.2
Best match: python-dateutil 2.8.2
Adding python-dateutil 2.8.2 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for zict==2.0.0
Best match: zict 2.0.0
Processing zict-2.0.0-py3.8.egg
zict 2.0.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg
Searching for tornado==6.1
Best match: tornado 6.1
Processing tornado-6.1-py3.8-linux-x86_64.egg
tornado 6.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg
Searching for toolz==0.11.1
Best match: toolz 0.11.1
Processing toolz-0.11.1-py3.8.egg
toolz 0.11.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/toolz-0.11.1-py3.8.egg
Searching for tblib==1.7.0
Best match: tblib 1.7.0
Processing tblib-1.7.0-py3.8.egg
tblib 1.7.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg
Searching for sortedcontainers==2.4.0
Best match: sortedcontainers 2.4.0
Processing sortedcontainers-2.4.0-py3.8.egg
sortedcontainers 2.4.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg
Searching for PyYAML==5.4.1
Best match: PyYAML 5.4.1
Processing PyYAML-5.4.1-py3.8-linux-x86_64.egg
PyYAML 5.4.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg
Searching for psutil==5.8.0
Best match: psutil 5.8.0
Processing psutil-5.8.0-py3.8-linux-x86_64.egg
psutil 5.8.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg
Searching for msgpack==1.0.2
Best match: msgpack 1.0.2
Processing msgpack-1.0.2-py3.8-linux-x86_64.egg
msgpack 1.0.2 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/msgpack-1.0.2-py3.8-linux-x86_64.egg
Searching for cloudpickle==1.6.0
Best match: cloudpickle 1.6.0
Processing cloudpickle-1.6.0-py3.8.egg
cloudpickle 1.6.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/cloudpickle-1.6.0-py3.8.egg
Searching for click==8.0.1
Best match: click 8.0.1
Processing click-8.0.1-py3.8.egg
click 8.0.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/click-8.0.1-py3.8.egg
Searching for partd==1.2.0
Best match: partd 1.2.0
Processing partd-1.2.0-py3.8.egg
partd 1.2.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg
Searching for fsspec==2021.9.0
Best match: fsspec 2021.9.0
Removing fsspec 2021.8.1 from easy-install.pth file
Adding fsspec 2021.9.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for packaging==21.0
Best match: packaging 21.0
Adding packaging 21.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for six==1.15.0
Best match: six 1.15.0
Adding six 1.15.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for HeapDict==1.0.1
Best match: HeapDict 1.0.1
Processing HeapDict-1.0.1-py3.8.egg
HeapDict 1.0.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg
Searching for locket==0.2.1
Best match: locket 0.2.1
Processing locket-0.2.1-py3.8.egg
locket 0.2.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg
Searching for pyparsing==2.4.7
Best match: pyparsing 2.4.7
Adding pyparsing 2.4.7 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Finished processing dependencies for nvtabular==0.7.0+6.gbb29f5d
Running black --check
All done! ✨ 🍰 ✨
130 files would be left unchanged.
Running flake8
Running isort
Skipped 2 files
Running bandit
Running pylint
************* Module nvtabular.ops.categorify
nvtabular/ops/categorify.py:497:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)
************* Module nvtabular.ops.fill
nvtabular/ops/fill.py:67:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)


Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)

Running flake8-nb
Building docs
make: Entering directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.7) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
make: Leaving directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-6.2.5, py-1.10.0, pluggy-1.0.0
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: cov-2.12.1, forked-1.3.0, xdist-2.4.0
collected 1543 items / 1 skipped / 1542 selected

tests/unit/test_dask_nvt.py ............................................ [ 2%]
....................................................................... [ 7%]
tests/unit/test_io.py .................................................. [ 10%]
........................................................................ [ 15%]
.................ssssssss............................................... [ 20%]
........ [ 20%]
tests/unit/test_notebooks.py ...... [ 20%]
tests/unit/test_tf4rec.py . [ 20%]
tests/unit/test_tools.py ...................... [ 22%]
tests/unit/test_triton_inference.py .............................. [ 24%]
tests/unit/columns/test_column_schemas.py .............................. [ 26%]
.................................................... [ 29%]
tests/unit/columns/test_column_selector.py .................... [ 30%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 31%]
tests/unit/framework_utils/test_tf_layers.py ........................... [ 32%]
................................................... [ 36%]
tests/unit/framework_utils/test_torch_layers.py . [ 36%]
tests/unit/loader/test_dataloader_backend.py .... [ 36%]
tests/unit/loader/test_tf_dataloader.py ................................ [ 38%]
........................................s.. [ 41%]
tests/unit/loader/test_torch_dataloader.py ............................. [ 43%]
........................................................ [ 46%]
tests/unit/ops/test_column_similarity.py ........................ [ 48%]
tests/unit/ops/test_ops.py ............................................. [ 51%]
........................................................................ [ 55%]
........................................................................ [ 60%]
........................................................................ [ 65%]
........................................................................ [ 69%]
........................................................................ [ 74%]
................................................. [ 77%]
tests/unit/ops/test_ops_schema.py ...................................... [ 80%]
........................................................................ [ 84%]
........................................................................ [ 89%]
.......................... [ 91%]
tests/unit/workflow/test_cpu_workflow.py ...... [ 91%]
tests/unit/workflow/test_workflow.py ................................... [ 93%]
.......................................................... [ 97%]
tests/unit/workflow/test_workflow_node.py ........... [ 98%]
tests/unit/workflow/test_workflow_ops.py .. [ 98%]
tests/unit/workflow/test_workflow_schemas.py ....................... [100%]

=============================== warnings summary ===============================
tests/unit/test_dask_nvt.py: 3 warnings
tests/unit/test_io.py: 24 warnings
tests/unit/test_tf4rec.py: 1 warning
tests/unit/test_tools.py: 2 warnings
tests/unit/test_triton_inference.py: 5 warnings
tests/unit/loader/test_tf_dataloader.py: 50 warnings
tests/unit/loader/test_torch_dataloader.py: 16 warnings
tests/unit/ops/test_column_similarity.py: 7 warnings
tests/unit/ops/test_ops.py: 74 warnings
tests/unit/workflow/test_workflow.py: 30 warnings
tests/unit/workflow/test_workflow_node.py: 1 warning
tests/unit/workflow/test_workflow_schemas.py: 1 warning
/var/jenkins_home/.local/lib/python3.8/site-packages/numba-0.54.0-py3.8-linux-x86_64.egg/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (1) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))

tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 36 warnings
tests/unit/workflow/test_workflow.py: 44 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/workflow.py:89: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for execution. Please use the client argument to initialize a Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 52 warnings
tests/unit/workflow/test_workflow.py: 35 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dask.py:375: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for this write operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/test_io.py: 96 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/__init__.py:38: DeprecationWarning: ColumnGroup is deprecated, use ColumnSelector instead
warnings.warn("ColumnGroup is deprecated, use ColumnSelector instead", DeprecationWarning)

tests/unit/test_io.py: 24 warnings
tests/unit/loader/test_torch_dataloader.py: 54 warnings
tests/unit/workflow/test_workflow_node.py: 1 warning
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/node.py:47: FutureWarning: The ["a", "b", "c"] >> ops.Operator syntax for creating a ColumnGroup has been deprecated in NVTabular 21.09 and will be removed in a future version.
warnings.warn(

tests/unit/test_io.py: 20 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:516: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler is being used for this shuffle operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.1]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:125: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[f"{col}_filled"] = df[col].isna()

tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.1]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:126: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[col] = df[col].fillna(self.medians[col])

tests/unit/ops/test_ops.py::test_fill_missing[True-True-parquet]
tests/unit/ops/test_ops.py::test_fill_missing[True-False-parquet]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg/pandas/core/indexing.py:1732: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self._setitem_single_block(indexer, value, name)

tests/unit/ops/test_ops.py::test_fill_missing[True-True-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:54: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[f"{col}_filled"] = df[col].isna()

tests/unit/ops/test_ops.py::test_fill_missing[True-True-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:55: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[col] = df[col].fillna(self.fill_val)

tests/unit/ops/test_ops.py: 80 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/join_external.py:191: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[tmp] = _arange(len(df), like_df=df, dtype="int32")

tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_groupby_op[id-True]
tests/unit/ops/test_ops.py::test_groupby_op[id-False]
/usr/local/lib/python3.8/dist-packages/dask/dataframe/core.py:6778: UserWarning: Insufficient elements for head. 1 elements requested, only 0 elements available. Try passing larger npartitions to head.
warnings.warn(msg.format(n, len(r)))

tests/unit/workflow/test_cpu_workflow.py: 14 warnings
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg/pandas/core/frame.py:3641: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self[k1] = value[k2]

-- Docs: https://docs.pytest.org/en/stable/warnings.html
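
Aside: several of the dask warnings above point at the same remedy, namely passing a distributed client explicitly when constructing Dataset and Workflow objects. A minimal sketch of that pattern follows; it is not part of this PR, and the parquet path, column name, and feature graph are placeholders.

# Sketch only: the "client" keyword is what the warnings above refer to;
# the parquet path and the single Normalize feature are placeholders.
from dask.distributed import Client

import nvtabular as nvt
from nvtabular import ops

client = Client()  # any running dask cluster (often dask-cuda based in practice)

features = ["x"] >> ops.Normalize()  # placeholder feature graph

dataset = nvt.Dataset("/path/to/data.parquet", client=client)
workflow = nvt.Workflow(features, client=client)
workflow.fit(dataset)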

---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name Stmts Miss Branch BrPart Cover Missing

examples/multi-gpu-movielens/torch_trainer.py 65 0 6 1 99% 32->36
examples/multi-gpu-movielens/torch_trainer_dist.py 63 0 2 0 100%
nvtabular/__init__.py 18 0 0 0 100%
nvtabular/columns/__init__.py 2 0 0 0 100%
nvtabular/columns/schema.py 217 17 107 20 89% 46->62, 49, 51, 53-56, 58, 98->114, 106, 110, 156, 183, 269->276, 272->274, 283, 300->305, 303->305, 316, 340, 347, 356, 359, 364->363
nvtabular/columns/selector.py 74 1 34 0 99% 121
nvtabular/dispatch.py 290 55 144 23 79% 36-40, 45-47, 53-63, 70-71, 114-116, 121-124, 128-133, 140, 159, 170, 176, 181->183, 194, 217-220, 259->261, 268, 271, 277, 293, 300, 331->336, 334, 337, 340->344, 377, 388-391, 417-420, 450, 454, 495, 519, 521, 528
nvtabular/framework_utils/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/tensorflow/__init__.py 1 0 0 0 100%
nvtabular/framework_utils/tensorflow/feature_column_utils.py 134 78 90 15 39% 30, 99, 103, 114-130, 140, 143-158, 162, 166-167, 173-198, 207-217, 220-227, 229->233, 234, 239-279, 282
nvtabular/framework_utils/tensorflow/layers/__init__.py 4 0 0 0 100%
nvtabular/framework_utils/tensorflow/layers/embedding.py 153 12 85 6 91% 60, 68->49, 122, 179, 231-239, 335->343, 357->360, 363-364, 367
nvtabular/framework_utils/tensorflow/layers/interaction.py 47 25 20 1 43% 49, 74-103, 106-110, 113
nvtabular/framework_utils/tensorflow/layers/outer_product.py 30 24 10 0 15% 37-38, 41-60, 71-84, 87
nvtabular/framework_utils/tensorflow/tfrecords_to_parquet.py 58 58 30 0 0% 16-111
nvtabular/framework_utils/torch/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/torch/layers/__init__.py 2 0 0 0 100%
nvtabular/framework_utils/torch/layers/embeddings.py 32 2 14 2 91% 50, 91
nvtabular/framework_utils/torch/models.py 45 1 28 4 93% 57->61, 87->89, 93->96, 103
nvtabular/framework_utils/torch/utils.py 75 5 30 5 90% 51->53, 64, 71->76, 75, 118-120
nvtabular/inference/__init__.py 0 0 0 0 100%
nvtabular/inference/triton/__init__.py 385 210 180 13 45% 82-86, 141-174, 195-218, 263-307, 338, 364-372, 380-387, 406, 428-444, 485-489, 527-537, 583-623, 629-645, 649-716, 723->726, 726->722, 762-772, 781, 791, 812, 818-844, 850-876, 883, 889->892, 893
nvtabular/inference/triton/benchmarking_tools.py 52 52 10 0 0% 2-103
nvtabular/inference/triton/data_conversions.py 87 3 58 4 95% 32-33, 84
nvtabular/inference/triton/model.py 176 176 98 0 0% 27-332
nvtabular/inference/triton/model_config_pb2.py 299 0 2 0 100%
nvtabular/inference/triton/model_pt.py 101 101 40 0 0% 27-220
nvtabular/io/__init__.py 4 0 0 0 100%
nvtabular/io/avro.py 88 88 30 0 0% 16-189
nvtabular/io/csv.py 57 6 20 5 86% 22-23, 99, 103->107, 108, 110, 124
nvtabular/io/dask.py 183 8 72 11 93% 111, 114, 150, 401, 411, 428->431, 439, 443->445, 445->441, 450, 452
nvtabular/io/dataframe_engine.py 61 5 28 6 88% 19-20, 50, 69, 88->92, 92->97, 94->97, 97->116, 125
nvtabular/io/dataset.py 361 44 172 29 85% 47-48, 264, 266, 279, 304-318, 441->515, 446-449, 454->464, 471->469, 472->476, 489->493, 504, 515->524, 575-576, 577->581, 629, 752, 754, 756, 762, 766-768, 770, 830-831, 858, 865-866, 872, 878, 975-976, 1094-1099, 1105, 1117-1118, 1206
nvtabular/io/dataset_engine.py 31 1 4 0 97% 48
nvtabular/io/fsspec_utils.py 117 103 64 0 8% 26-27, 42-98, 103-135, 151-191, 213-263, 268-284, 288-290, 304-315
nvtabular/io/hugectr.py 45 2 24 2 91% 34, 74->97, 101
nvtabular/io/parquet.py 569 31 188 28 92% 34-35, 58, 80->103, 98, 123, 152-153, 170->195, 181->195, 232-240, 260, 266, 284->286, 300, 318->328, 321, 370->382, 374, 496-501, 539-544, 660->667, 728->733, 734-735, 855, 859, 863, 869, 901, 918, 922, 929->931, 1039->exit, 1049->1054, 1059->1069, 1074, 1096, 1123
nvtabular/io/shuffle.py 31 7 16 4 72% 42, 44-45, 49, 62-64
nvtabular/io/writer.py 185 13 74 5 92% 25-26, 52, 80, 126, 129, 213, 222, 225, 268, 300-302
nvtabular/io/writer_factory.py 18 2 8 2 85% 35, 60
nvtabular/loader/__init__.py 0 0 0 0 100%
nvtabular/loader/backend.py 331 13 140 11 95% 111, 143-144, 244->246, 256-260, 306-307, 346->350, 347->346, 421, 425-426, 456, 561, 569
nvtabular/loader/tensorflow.py 163 22 52 7 86% 58, 66-69, 84, 98, 314, 350, 365-367, 396-398, 408-416, 419-422
nvtabular/loader/tf_utils.py 55 10 20 5 80% 29->32, 32->34, 39->41, 43, 50-51, 58-60, 66-70
nvtabular/loader/torch.py 81 13 16 2 78% 25-27, 30-36, 117, 155-156
nvtabular/ops/__init__.py 22 0 0 0 100%
nvtabular/ops/add_metadata.py 9 0 0 0 100%
nvtabular/ops/bucketize.py 37 10 18 3 69% 53-55, 59->exit, 62-65, 84-87, 94
nvtabular/ops/categorify.py 622 66 332 47 86% 244, 246, 263, 267, 275, 283, 285, 312, 331-332, 359, 370->374, 378-385, 467-468, 493-494, 611, 704, 722, 758, 836-837, 852-856, 857->821, 875, 883, 890->exit, 914, 917->920, 972, 977, 999->1003, 1005->962, 1011-1014, 1026, 1030, 1032, 1039, 1044-1047, 1125, 1127, 1197->1220, 1203->1220, 1221-1226, 1263, 1282->1287, 1286, 1296->1293, 1301->1293, 1308, 1311, 1319-1329
nvtabular/ops/clip.py 18 2 6 3 79% 44, 52->54, 55
nvtabular/ops/column_similarity.py 118 25 38 5 74% 19-20, 78->exit, 108, 134, 198-199, 208-210, 218-234, 251->254, 255, 265
nvtabular/ops/data_stats.py 56 2 22 3 94% 91->93, 95, 97->87, 102
nvtabular/ops/difference_lag.py 31 1 8 1 95% 69->71, 94
nvtabular/ops/dropna.py 8 0 0 0 100%
nvtabular/ops/fill.py 91 12 36 3 82% 63-67, 93, 121, 147, 151, 162-165
nvtabular/ops/filter.py 20 1 6 1 92% 49
nvtabular/ops/groupby.py 119 3 70 4 96% 73, 84, 94->96, 106->111, 141
nvtabular/ops/hash_bucket.py 41 2 20 2 93% 72, 106->112, 118
nvtabular/ops/hashed_cross.py 36 4 15 3 86% 53, 66, 81, 91
nvtabular/ops/internal/__init__.py 3 0 0 0 100%
nvtabular/ops/internal/concat_columns.py 11 0 0 0 100%
nvtabular/ops/internal/identity.py 6 1 0 0 83% 42
nvtabular/ops/internal/subset_columns.py 13 1 0 0 92% 53
nvtabular/ops/join_external.py 92 18 36 7 76% 20-21, 114, 116, 118, 135-161, 177->179, 216->227, 221
nvtabular/ops/join_groupby.py 101 7 36 4 92% 108, 115, 124, 131->130, 215-216, 219-220
nvtabular/ops/lambdaop.py 39 6 18 6 79% 59, 63, 77, 89, 94, 103
nvtabular/ops/list_slice.py 66 24 26 1 58% 21-22, 53-54, 104-118, 126-137
nvtabular/ops/logop.py 13 0 0 0 100%
nvtabular/ops/moments.py 65 0 20 0 100%
nvtabular/ops/normalize.py 81 10 14 1 86% 70, 78-79, 85, 118-119, 141-142, 146, 157
nvtabular/ops/operator.py 66 1 14 1 98% 111
nvtabular/ops/rename.py 41 3 22 3 90% 47, 88-90
nvtabular/ops/stat_operator.py 8 0 0 0 100%
nvtabular/ops/target_encoding.py 153 11 66 4 91% 167->171, 175->184, 232-233, 236-237, 249-255, 346->349, 362
nvtabular/tags.py 16 0 0 0 100%
nvtabular/tools/__init__.py 0 0 0 0 100%
nvtabular/tools/data_gen.py 236 1 62 1 99% 321
nvtabular/tools/dataset_inspector.py 50 7 18 1 79% 32-39
nvtabular/tools/inspector_script.py 46 46 0 0 0% 17-168
nvtabular/utils.py 106 43 48 8 54% 31-32, 36-37, 50, 61-62, 64-66, 69, 72, 78, 84, 90-126, 145, 149->153
nvtabular/worker.py 82 5 38 7 90% 24-25, 82->99, 91, 92->99, 99->102, 108, 110, 111->113
nvtabular/workflow/__init__.py 2 0 0 0 100%
nvtabular/workflow/node.py 240 18 116 10 89% 55, 93->98, 146, 248->252, 288, 302, 311, 329-334, 339, 388-389, 400->395, 453-458
nvtabular/workflow/workflow.py 221 15 112 7 93% 28-29, 47, 139, 195, 222-224, 332, 347-348, 366-367, 503, 515

TOTAL 7775 1533 3133 347 78%
Coverage XML written to file coverage.xml

Required test coverage of 70% reached. Total coverage: 77.57%
=========================== short test summary info ============================
SKIPPED [1] ../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/io/tests/test_s3.py:16: could not import 's3fs': No module named 's3fs'
SKIPPED [8] tests/unit/test_io.py:581: could not import 'uavro': No module named 'uavro'
SKIPPED [1] tests/unit/loader/test_tf_dataloader.py:521: not working correctly in ci environment
========= 1534 passed, 10 skipped, 697 warnings in 1932.62s (0:32:12) ==========
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://github.com/gitapi/repos/NVIDIA/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins3581008013040148791.sh

@nvidia-merlin-bot
Contributor

Click to view CI Results
GitHub pull request #1147 of commit c513cd22a545051cc4f5b1c46c3d86f393316e0e, no merge conflicts.
Running as SYSTEM
Setting status of c513cd22a545051cc4f5b1c46c3d86f393316e0e to PENDING with url http://10.20.13.93:8080/job/nvtabular_tests/3574/ and message: 'Pending'
Using context: Jenkins Unit Test Run
Building in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA/NVTabular.git
 > git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA/NVTabular.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA/NVTabular.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA/NVTabular.git +refs/pull/1147/*:refs/remotes/origin/pr/1147/* # timeout=10
 > git rev-parse c513cd22a545051cc4f5b1c46c3d86f393316e0e^{commit} # timeout=10
Checking out Revision c513cd22a545051cc4f5b1c46c3d86f393316e0e (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f c513cd22a545051cc4f5b1c46c3d86f393316e0e # timeout=10
Commit message: "Merge branch 'epochs-arg' of https://github.com/rjzamora/NVTabular into epochs-arg"
 > git rev-list --no-walk b21c57dce7348c43107760bdc0c68b62f7f98709 # timeout=10
[nvtabular_tests] $ /bin/bash /tmp/jenkins2619214284882428384.sh
Installing NVTabular
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Requirement already satisfied: pip in /var/jenkins_home/.local/lib/python3.8/site-packages (21.2.4)
Requirement already satisfied: setuptools in /var/jenkins_home/.local/lib/python3.8/site-packages (58.1.0)
Requirement already satisfied: wheel in /var/jenkins_home/.local/lib/python3.8/site-packages (0.37.0)
Requirement already satisfied: pybind11 in /var/jenkins_home/.local/lib/python3.8/site-packages (2.7.1)
running develop
running egg_info
creating nvtabular.egg-info
writing nvtabular.egg-info/PKG-INFO
writing dependency_links to nvtabular.egg-info/dependency_links.txt
writing requirements to nvtabular.egg-info/requires.txt
writing top-level names to nvtabular.egg-info/top_level.txt
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching '*.h' under directory 'cpp'
warning: no files found matching '*.cu' under directory 'cpp'
warning: no files found matching '*.cuh' under directory 'cpp'
adding license file 'LICENSE'
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
running build_ext
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.8 -c flagcheck.cpp -o flagcheck.o -std=c++17
building 'nvtabular_cpp' extension
creating build
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/cpp
creating build/temp.linux-x86_64-3.8/cpp/nvtabular
creating build/temp.linux-x86_64-3.8/cpp/nvtabular/inference
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+8.gc513cd2 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+8.gc513cd2 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+8.gc513cd2 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/categorify.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+8.gc513cd2 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/fill.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -std=c++17 -fvisibility=hidden -g0
creating build/lib.linux-x86_64-3.8
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -o build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so
copying build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so -> 
Generating nvtabular/inference/triton/model_config_pb2.py from nvtabular/inference/triton/model_config.proto
Creating /var/jenkins_home/.local/lib/python3.8/site-packages/nvtabular.egg-link (link to .)
nvtabular 0.7.0+8.gc513cd2 is already the active version in easy-install.pth

Installed /var/jenkins_home/workspace/nvtabular_tests/nvtabular
Processing dependencies for nvtabular==0.7.0+8.gc513cd2
Searching for protobuf==3.18.0
Best match: protobuf 3.18.0
Adding protobuf 3.18.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for tensorflow-metadata==1.2.0
Best match: tensorflow-metadata 1.2.0
Processing tensorflow_metadata-1.2.0-py3.8.egg
tensorflow-metadata 1.2.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tensorflow_metadata-1.2.0-py3.8.egg
Searching for pyarrow==4.0.1
Best match: pyarrow 4.0.1
Adding pyarrow 4.0.1 to easy-install.pth file
Installing plasma_store script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for tqdm==4.61.2
Best match: tqdm 4.61.2
Processing tqdm-4.61.2-py3.8.egg
tqdm 4.61.2 is already the active version in easy-install.pth
Installing tqdm script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tqdm-4.61.2-py3.8.egg
Searching for numba==0.54.0
Best match: numba 0.54.0
Processing numba-0.54.0-py3.8-linux-x86_64.egg
numba 0.54.0 is already the active version in easy-install.pth
Installing pycc script to /var/jenkins_home/.local/bin
Installing numba script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/numba-0.54.0-py3.8-linux-x86_64.egg
Searching for pandas==1.3.3
Best match: pandas 1.3.3
Processing pandas-1.3.3-py3.8-linux-x86_64.egg
pandas 1.3.3 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg
Searching for distributed==2021.7.1
Best match: distributed 2021.7.1
Processing distributed-2021.7.1-py3.8.egg
distributed 2021.7.1 is already the active version in easy-install.pth
Installing dask-ssh script to /var/jenkins_home/.local/bin
Installing dask-scheduler script to /var/jenkins_home/.local/bin
Installing dask-worker script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/distributed-2021.7.1-py3.8.egg
Searching for dask==2021.7.1
Best match: dask 2021.7.1
Adding dask 2021.7.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for googleapis-common-protos==1.53.0
Best match: googleapis-common-protos 1.53.0
Processing googleapis_common_protos-1.53.0-py3.8.egg
googleapis-common-protos 1.53.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/googleapis_common_protos-1.53.0-py3.8.egg
Searching for absl-py==0.12.0
Best match: absl-py 0.12.0
Processing absl_py-0.12.0-py3.8.egg
absl-py 0.12.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/absl_py-0.12.0-py3.8.egg
Searching for numpy==1.20.2
Best match: numpy 1.20.2
Adding numpy 1.20.2 to easy-install.pth file
Installing f2py script to /var/jenkins_home/.local/bin
Installing f2py3 script to /var/jenkins_home/.local/bin
Installing f2py3.8 script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for setuptools==58.1.0
Best match: setuptools 58.1.0
Adding setuptools 58.1.0 to easy-install.pth file

Using /var/jenkins_home/.local/lib/python3.8/site-packages
Searching for llvmlite==0.37.0
Best match: llvmlite 0.37.0
Processing llvmlite-0.37.0-py3.8-linux-x86_64.egg
llvmlite 0.37.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/llvmlite-0.37.0-py3.8-linux-x86_64.egg
Searching for pytz==2021.1
Best match: pytz 2021.1
Adding pytz 2021.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for python-dateutil==2.8.2
Best match: python-dateutil 2.8.2
Adding python-dateutil 2.8.2 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for zict==2.0.0
Best match: zict 2.0.0
Processing zict-2.0.0-py3.8.egg
zict 2.0.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg
Searching for tornado==6.1
Best match: tornado 6.1
Processing tornado-6.1-py3.8-linux-x86_64.egg
tornado 6.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg
Searching for toolz==0.11.1
Best match: toolz 0.11.1
Processing toolz-0.11.1-py3.8.egg
toolz 0.11.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/toolz-0.11.1-py3.8.egg
Searching for tblib==1.7.0
Best match: tblib 1.7.0
Processing tblib-1.7.0-py3.8.egg
tblib 1.7.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg
Searching for sortedcontainers==2.4.0
Best match: sortedcontainers 2.4.0
Processing sortedcontainers-2.4.0-py3.8.egg
sortedcontainers 2.4.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg
Searching for PyYAML==5.4.1
Best match: PyYAML 5.4.1
Processing PyYAML-5.4.1-py3.8-linux-x86_64.egg
PyYAML 5.4.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg
Searching for psutil==5.8.0
Best match: psutil 5.8.0
Processing psutil-5.8.0-py3.8-linux-x86_64.egg
psutil 5.8.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg
Searching for msgpack==1.0.2
Best match: msgpack 1.0.2
Processing msgpack-1.0.2-py3.8-linux-x86_64.egg
msgpack 1.0.2 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/msgpack-1.0.2-py3.8-linux-x86_64.egg
Searching for cloudpickle==1.6.0
Best match: cloudpickle 1.6.0
Processing cloudpickle-1.6.0-py3.8.egg
cloudpickle 1.6.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/cloudpickle-1.6.0-py3.8.egg
Searching for click==8.0.1
Best match: click 8.0.1
Processing click-8.0.1-py3.8.egg
click 8.0.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/click-8.0.1-py3.8.egg
Searching for packaging==21.0
Best match: packaging 21.0
Adding packaging 21.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for fsspec==2021.9.0
Best match: fsspec 2021.9.0
Adding fsspec 2021.9.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for partd==1.2.0
Best match: partd 1.2.0
Processing partd-1.2.0-py3.8.egg
partd 1.2.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg
Searching for six==1.15.0
Best match: six 1.15.0
Adding six 1.15.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for HeapDict==1.0.1
Best match: HeapDict 1.0.1
Processing HeapDict-1.0.1-py3.8.egg
HeapDict 1.0.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg
Searching for pyparsing==2.4.7
Best match: pyparsing 2.4.7
Adding pyparsing 2.4.7 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for locket==0.2.1
Best match: locket 0.2.1
Processing locket-0.2.1-py3.8.egg
locket 0.2.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg
Finished processing dependencies for nvtabular==0.7.0+8.gc513cd2
Running black --check
All done! ✨ 🍰 ✨
130 files would be left unchanged.
Running flake8
Running isort
Skipped 2 files
Running bandit
Running pylint
************* Module nvtabular.loader.torch
nvtabular/loader/torch.py:92:8: E1123: Unexpected keyword argument 'epochs' in unbound method call (unexpected-keyword-arg)
************* Module nvtabular.ops.categorify
nvtabular/ops/categorify.py:497:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)
************* Module nvtabular.ops.fill
nvtabular/ops/fill.py:67:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)


Your code has been rated at 9.99/10 (previous run: 10.00/10, -0.01)

Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://github.com/gitapi/repos/NVIDIA/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins2393362050831856395.sh

Comment on lines +136 to +147
data_loader = DataLoader(
dataset,
cat_names=cat_names,
cont_names=cont_names,
batch_size=batch_size,
label_names=label_name,
shuffle=False,
)

# Convert to iterators and then to DataFrames
df1 = _concat(list(data_loader._buff.itr))
df2 = _concat(list(data_loader.epochs(epochs)._buff.itr))
Collaborator Author


@jperez999 - Do you like this API any better? Note that data_loader.epochs(n) is used here to create a multi-epoch DataLoader object from the base (single-epoch) DataLoader.

Contributor


Yeah, this is better. It's allowed, but it isn't always fixed to a given number, only during iteration. I feel like I can get behind this.
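
For anyone skimming the thread, a minimal usage sketch of the pattern discussed above, not taken from the PR itself: the import path, dataset path, and column names are assumptions/placeholders, and the constructor arguments simply mirror the snippet quoted above.

import nvtabular as nvt

# Assumed import path for the base loader class; in practice you would
# usually construct one of its framework-specific subclasses instead.
from nvtabular.loader.backend import DataLoader

dataset = nvt.Dataset("/path/to/data.parquet")  # placeholder path

data_loader = DataLoader(
    dataset,
    cat_names=["cat_a"],      # placeholder column names
    cont_names=["cont_a"],
    label_names=["label"],
    batch_size=1024,
    shuffle=False,
)

# The base object represents a single pass (epoch) over the dataset.
# data_loader.epochs(3) returns a new loader that repeats the same data
# three times, so one iteration loop covers three epochs.
for batch in data_loader.epochs(3):
    ...  # batch format depends on the concrete (framework) loader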

@nvidia-merlin-bot
Contributor

Click to view CI Results
GitHub pull request #1147 of commit e6d290b5295463884e2262083df6d726c24b304a, no merge conflicts.
Running as SYSTEM
Setting status of e6d290b5295463884e2262083df6d726c24b304a to PENDING with url http://10.20.13.93:8080/job/nvtabular_tests/3575/ and message: 'Pending'
Using context: Jenkins Unit Test Run
Building in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA/NVTabular.git
 > git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA/NVTabular.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA/NVTabular.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA/NVTabular.git +refs/pull/1147/*:refs/remotes/origin/pr/1147/* # timeout=10
 > git rev-parse e6d290b5295463884e2262083df6d726c24b304a^{commit} # timeout=10
Checking out Revision e6d290b5295463884e2262083df6d726c24b304a (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e6d290b5295463884e2262083df6d726c24b304a # timeout=10
Commit message: "remove stale epochs kwargs"
 > git rev-list --no-walk c513cd22a545051cc4f5b1c46c3d86f393316e0e # timeout=10
[nvtabular_tests] $ /bin/bash /tmp/jenkins2176570803173748388.sh
Installing NVTabular
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Requirement already satisfied: pip in /var/jenkins_home/.local/lib/python3.8/site-packages (21.2.4)
Requirement already satisfied: setuptools in /var/jenkins_home/.local/lib/python3.8/site-packages (58.1.0)
Requirement already satisfied: wheel in /var/jenkins_home/.local/lib/python3.8/site-packages (0.37.0)
Requirement already satisfied: pybind11 in /var/jenkins_home/.local/lib/python3.8/site-packages (2.7.1)
running develop
running egg_info
creating nvtabular.egg-info
writing nvtabular.egg-info/PKG-INFO
writing dependency_links to nvtabular.egg-info/dependency_links.txt
writing requirements to nvtabular.egg-info/requires.txt
writing top-level names to nvtabular.egg-info/top_level.txt
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching '*.h' under directory 'cpp'
warning: no files found matching '*.cu' under directory 'cpp'
warning: no files found matching '*.cuh' under directory 'cpp'
adding license file 'LICENSE'
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
running build_ext
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.8 -c flagcheck.cpp -o flagcheck.o -std=c++17
building 'nvtabular_cpp' extension
creating build
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/cpp
creating build/temp.linux-x86_64-3.8/cpp/nvtabular
creating build/temp.linux-x86_64-3.8/cpp/nvtabular/inference
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+9.ge6d290b -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+9.ge6d290b -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+9.ge6d290b -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/categorify.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+9.ge6d290b -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/fill.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -std=c++17 -fvisibility=hidden -g0
creating build/lib.linux-x86_64-3.8
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -o build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so
copying build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so -> 
Generating nvtabular/inference/triton/model_config_pb2.py from nvtabular/inference/triton/model_config.proto
Creating /var/jenkins_home/.local/lib/python3.8/site-packages/nvtabular.egg-link (link to .)
nvtabular 0.7.0+9.ge6d290b is already the active version in easy-install.pth

Installed /var/jenkins_home/workspace/nvtabular_tests/nvtabular
Processing dependencies for nvtabular==0.7.0+9.ge6d290b
Searching for protobuf==3.18.0
Best match: protobuf 3.18.0
Adding protobuf 3.18.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for tensorflow-metadata==1.2.0
Best match: tensorflow-metadata 1.2.0
Processing tensorflow_metadata-1.2.0-py3.8.egg
tensorflow-metadata 1.2.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tensorflow_metadata-1.2.0-py3.8.egg
Searching for pyarrow==4.0.1
Best match: pyarrow 4.0.1
Adding pyarrow 4.0.1 to easy-install.pth file
Installing plasma_store script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for tqdm==4.61.2
Best match: tqdm 4.61.2
Processing tqdm-4.61.2-py3.8.egg
tqdm 4.61.2 is already the active version in easy-install.pth
Installing tqdm script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tqdm-4.61.2-py3.8.egg
Searching for numba==0.54.0
Best match: numba 0.54.0
Processing numba-0.54.0-py3.8-linux-x86_64.egg
numba 0.54.0 is already the active version in easy-install.pth
Installing pycc script to /var/jenkins_home/.local/bin
Installing numba script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/numba-0.54.0-py3.8-linux-x86_64.egg
Searching for pandas==1.3.3
Best match: pandas 1.3.3
Processing pandas-1.3.3-py3.8-linux-x86_64.egg
pandas 1.3.3 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg
Searching for distributed==2021.7.1
Best match: distributed 2021.7.1
Processing distributed-2021.7.1-py3.8.egg
distributed 2021.7.1 is already the active version in easy-install.pth
Installing dask-ssh script to /var/jenkins_home/.local/bin
Installing dask-scheduler script to /var/jenkins_home/.local/bin
Installing dask-worker script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/distributed-2021.7.1-py3.8.egg
Searching for dask==2021.7.1
Best match: dask 2021.7.1
Adding dask 2021.7.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for googleapis-common-protos==1.53.0
Best match: googleapis-common-protos 1.53.0
Processing googleapis_common_protos-1.53.0-py3.8.egg
googleapis-common-protos 1.53.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/googleapis_common_protos-1.53.0-py3.8.egg
Searching for absl-py==0.12.0
Best match: absl-py 0.12.0
Processing absl_py-0.12.0-py3.8.egg
absl-py 0.12.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/absl_py-0.12.0-py3.8.egg
Searching for numpy==1.20.2
Best match: numpy 1.20.2
Adding numpy 1.20.2 to easy-install.pth file
Installing f2py script to /var/jenkins_home/.local/bin
Installing f2py3 script to /var/jenkins_home/.local/bin
Installing f2py3.8 script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for setuptools==58.1.0
Best match: setuptools 58.1.0
Adding setuptools 58.1.0 to easy-install.pth file

Using /var/jenkins_home/.local/lib/python3.8/site-packages
Searching for llvmlite==0.37.0
Best match: llvmlite 0.37.0
Processing llvmlite-0.37.0-py3.8-linux-x86_64.egg
llvmlite 0.37.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/llvmlite-0.37.0-py3.8-linux-x86_64.egg
Searching for pytz==2021.1
Best match: pytz 2021.1
Adding pytz 2021.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for python-dateutil==2.8.2
Best match: python-dateutil 2.8.2
Adding python-dateutil 2.8.2 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for zict==2.0.0
Best match: zict 2.0.0
Processing zict-2.0.0-py3.8.egg
zict 2.0.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg
Searching for tornado==6.1
Best match: tornado 6.1
Processing tornado-6.1-py3.8-linux-x86_64.egg
tornado 6.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg
Searching for toolz==0.11.1
Best match: toolz 0.11.1
Processing toolz-0.11.1-py3.8.egg
toolz 0.11.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/toolz-0.11.1-py3.8.egg
Searching for tblib==1.7.0
Best match: tblib 1.7.0
Processing tblib-1.7.0-py3.8.egg
tblib 1.7.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg
Searching for sortedcontainers==2.4.0
Best match: sortedcontainers 2.4.0
Processing sortedcontainers-2.4.0-py3.8.egg
sortedcontainers 2.4.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg
Searching for PyYAML==5.4.1
Best match: PyYAML 5.4.1
Processing PyYAML-5.4.1-py3.8-linux-x86_64.egg
PyYAML 5.4.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg
Searching for psutil==5.8.0
Best match: psutil 5.8.0
Processing psutil-5.8.0-py3.8-linux-x86_64.egg
psutil 5.8.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg
Searching for msgpack==1.0.2
Best match: msgpack 1.0.2
Processing msgpack-1.0.2-py3.8-linux-x86_64.egg
msgpack 1.0.2 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/msgpack-1.0.2-py3.8-linux-x86_64.egg
Searching for cloudpickle==1.6.0
Best match: cloudpickle 1.6.0
Processing cloudpickle-1.6.0-py3.8.egg
cloudpickle 1.6.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/cloudpickle-1.6.0-py3.8.egg
Searching for click==8.0.1
Best match: click 8.0.1
Processing click-8.0.1-py3.8.egg
click 8.0.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/click-8.0.1-py3.8.egg
Searching for fsspec==2021.9.0
Best match: fsspec 2021.9.0
Adding fsspec 2021.9.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for partd==1.2.0
Best match: partd 1.2.0
Processing partd-1.2.0-py3.8.egg
partd 1.2.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg
Searching for packaging==21.0
Best match: packaging 21.0
Adding packaging 21.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for six==1.15.0
Best match: six 1.15.0
Adding six 1.15.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for HeapDict==1.0.1
Best match: HeapDict 1.0.1
Processing HeapDict-1.0.1-py3.8.egg
HeapDict 1.0.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg
Searching for locket==0.2.1
Best match: locket 0.2.1
Processing locket-0.2.1-py3.8.egg
locket 0.2.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg
Searching for pyparsing==2.4.7
Best match: pyparsing 2.4.7
Adding pyparsing 2.4.7 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Finished processing dependencies for nvtabular==0.7.0+9.ge6d290b
Running black --check
All done! ✨ 🍰 ✨
130 files would be left unchanged.
Running flake8
Running isort
Skipped 2 files
Running bandit
Running pylint
************* Module nvtabular.ops.categorify
nvtabular/ops/categorify.py:497:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)
************* Module nvtabular.ops.fill
nvtabular/ops/fill.py:67:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)


Your code has been rated at 10.00/10 (previous run: 9.99/10, +0.01)

Running flake8-nb
Building docs
make: Entering directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.7) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
make: Leaving directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-6.2.5, py-1.10.0, pluggy-1.0.0
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: cov-2.12.1, forked-1.3.0, xdist-2.4.0
collected 1543 items / 1 skipped / 1542 selected

tests/unit/test_dask_nvt.py ............................................ [ 2%]
....................................................................... [ 7%]
tests/unit/test_io.py .................................................. [ 10%]
........................................................................ [ 15%]
.................ssssssss............................................... [ 20%]
........ [ 20%]
tests/unit/test_notebooks.py ...... [ 20%]
tests/unit/test_tf4rec.py . [ 20%]
tests/unit/test_tools.py ...................... [ 22%]
tests/unit/test_triton_inference.py .............................. [ 24%]
tests/unit/columns/test_column_schemas.py .............................. [ 26%]
.................................................... [ 29%]
tests/unit/columns/test_column_selector.py .................... [ 30%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 31%]
tests/unit/framework_utils/test_tf_layers.py .F......................... [ 32%]
................................................... [ 36%]
tests/unit/framework_utils/test_torch_layers.py . [ 36%]
tests/unit/loader/test_dataloader_backend.py .... [ 36%]
tests/unit/loader/test_tf_dataloader.py ................................ [ 38%]
........................................s.. [ 41%]
tests/unit/loader/test_torch_dataloader.py ............................. [ 43%]
........................................................ [ 46%]
tests/unit/ops/test_column_similarity.py ........................ [ 48%]
tests/unit/ops/test_ops.py ............................................. [ 51%]
........................................................................ [ 55%]
........................................................................ [ 60%]
........................................................................ [ 65%]
........................................................................ [ 69%]
........................................................................ [ 74%]
................................................. [ 77%]
tests/unit/ops/test_ops_schema.py ...................................... [ 80%]
........................................................................ [ 84%]
........................................................................ [ 89%]
.......................... [ 91%]
tests/unit/workflow/test_cpu_workflow.py ...... [ 91%]
tests/unit/workflow/test_workflow.py ................................... [ 93%]
.......................................................... [ 97%]
tests/unit/workflow/test_workflow_node.py ........... [ 98%]
tests/unit/workflow/test_workflow_ops.py .. [ 98%]
tests/unit/workflow/test_workflow_schemas.py ....................... [100%]

=================================== FAILURES ===================================
____________________ test_dense_embedding_layer[sum-concat] ____________________

aggregation = 'concat', combiner = 'sum'

@pytest.mark.parametrize("aggregation", ["stack", "concat"])
@pytest.mark.parametrize("combiner", ["sum", "mean"])  # TODO: add sqrtn
def test_dense_embedding_layer(aggregation, combiner):
    raw_good_columns = get_good_feature_columns()
    scalar_numeric, vector_numeric, one_hot, multi_hot = raw_good_columns
    one_hot_embedding = tf.feature_column.indicator_column(one_hot)
    multi_hot_embedding = tf.feature_column.embedding_column(multi_hot, 8, combiner=combiner)

    # should raise ValueError if passed categorical columns
    with pytest.raises(ValueError):
        embedding_layer = layers.DenseFeatures(raw_good_columns, aggregation=aggregation)

    if aggregation == "stack":
        # can't pass numeric to stack aggregation unless dims are 1
        with pytest.raises(ValueError):
            embedding_layer = layers.DenseFeatures(
                [
                    scalar_numeric,
                    vector_numeric,
                    one_hot_embedding,
                    multi_hot_embedding,
                ],
                aggregation=aggregation,
            )
        # can't have mismatched dims with stack aggregation
        with pytest.raises(ValueError):
            embedding_layer = layers.DenseFeatures(
                [one_hot_embedding, multi_hot_embedding], aggregation=aggregation
            )

        # reset b embedding to have matching dims
        multi_hot_embedding = tf.feature_column.embedding_column(multi_hot, 100, combiner=combiner)
        cols = [one_hot_embedding, multi_hot_embedding]
    else:
        cols = [scalar_numeric, vector_numeric, one_hot_embedding, multi_hot_embedding]

    embedding_layer = layers.DenseFeatures(cols, aggregation=aggregation)
    inputs = {
        "scalar_continuous": tf.keras.Input(name="scalar_continuous", shape=(1,), dtype=tf.float32),
        "vector_continuous": tf.keras.Input(
            name="vector_continuous__values", shape=(1,), dtype=tf.float32
        ),
        "one_hot": tf.keras.Input(name="one_hot", shape=(1,), dtype=tf.int64),
        "multi_hot": (
            tf.keras.Input(name="multi_hot__values", shape=(1,), dtype=tf.int64),
            tf.keras.Input(name="multi_hot__nnzs", shape=(1,), dtype=tf.int64),
        ),
    }
    if aggregation == "stack":
        inputs.pop("scalar_continuous")
        inputs.pop("vector_continuous")

    output = embedding_layer(inputs)
    model = tf.keras.Model(inputs=inputs, outputs=output)
    model.compile("sgd", "mse")

    # TODO: check for out-of-range categorical behavior
    scalar = np.array([0.1, -0.2, 0.3], dtype=np.float32)
    vector = np.random.randn(3, 128).astype("float32")
    one_hot = np.array([44, 21, 32])
    multi_hot_values = np.array([0, 2, 1, 4, 1, 3, 1])
    multi_hot_nnzs = np.array([1, 2, 4])
    x = {
        "scalar_continuous": scalar[:, None],
        "vector_continuous": vector.flatten()[:, None],
        "one_hot": one_hot[:, None],
        "multi_hot": (multi_hot_values[:, None], multi_hot_nnzs[:, None]),
    }
    if aggregation == "stack":
        x.pop("scalar_continuous")
        x.pop("vector_continuous")

    multi_hot_embedding_table = embedding_layer.embedding_tables["multi_hot"].numpy()
    multi_hot_embedding_rows = _compute_expected_multi_hot(
        multi_hot_embedding_table, multi_hot_values, multi_hot_nnzs, combiner
    )

    # check that shape and values match up
    y_hat = model(x).numpy()
    assert y_hat.shape[0] == 3
    if aggregation == "stack":
        assert len(y_hat.shape) == 3
        # len of columns is 2 because of mh (vals, nnzs) struct
        assert y_hat.shape[1] == (len(x))
        assert y_hat.shape[2] == 100
        np.testing.assert_allclose(y_hat[:, 0], multi_hot_embedding_rows, rtol=1e-05)
        y_c = y_hat[:, 1]

    elif aggregation == "concat":
        assert len(y_hat.shape) == 2
        assert y_hat.shape[1] == 1 + 100 + 8 + 128

        assert (y_hat[:, 108] == scalar).all()
        assert (y_hat[:, 109:] == vector).all()
>       np.testing.assert_allclose(y_hat[:, :8], multi_hot_embedding_rows, rtol=1e-05)

E AssertionError:
E Not equal to tolerance rtol=1e-05, atol=0
E
E Mismatched elements: 1 / 24 (4.17%)
E Max absolute difference: 5.9604645e-08
E Max relative difference: 1.1010912e-05
E x: array([[ 0.31704 , -0.58697 , 0.334609, 0.476073, 0.05929 , 0.023979,
E 0.0328 , -0.026317],
E [ 0.162156, -0.378135, -0.323321, -0.333824, 0.829316, 0.482501,...
E y: array([[ 0.31704 , -0.58697 , 0.334609, 0.476073, 0.05929 , 0.023979,
E 0.0328 , -0.026317],
E [ 0.162156, -0.378135, -0.323321, -0.333824, 0.829316, 0.482501,...

tests/unit/framework_utils/test_tf_layers.py:148: AssertionError
=============================== warnings summary ===============================
tests/unit/test_dask_nvt.py: 3 warnings
tests/unit/test_io.py: 24 warnings
tests/unit/test_tf4rec.py: 1 warning
tests/unit/test_tools.py: 2 warnings
tests/unit/test_triton_inference.py: 5 warnings
tests/unit/loader/test_tf_dataloader.py: 50 warnings
tests/unit/loader/test_torch_dataloader.py: 16 warnings
tests/unit/ops/test_column_similarity.py: 7 warnings
tests/unit/ops/test_ops.py: 74 warnings
tests/unit/workflow/test_workflow.py: 30 warnings
tests/unit/workflow/test_workflow_node.py: 1 warning
tests/unit/workflow/test_workflow_schemas.py: 1 warning
/var/jenkins_home/.local/lib/python3.8/site-packages/numba-0.54.0-py3.8-linux-x86_64.egg/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (1) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))

tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 36 warnings
tests/unit/workflow/test_workflow.py: 44 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/workflow.py:89: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for execution. Please use the client argument to initialize a Workflow object with distributed-execution enabled.
warnings.warn(
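
As an aside, this warning is only advisory: per its own message, constructing the Workflow (or Dataset) with an explicit client enables distributed execution and silences it. A minimal sketch, assuming a toy graph and input path (both hypothetical) just to show where the client argument goes:

import nvtabular as nvt
from nvtabular import ops
from dask.distributed import Client

client = Client()  # any dask.distributed client, e.g. one attached to a LocalCUDACluster

# Hypothetical one-op graph, only so there is something to hand to Workflow:
features = ["userId", "movieId"] >> ops.Categorify()

workflow = nvt.Workflow(features, client=client)       # client passed explicitly
dataset = nvt.Dataset("train.parquet", client=client)  # Dataset accepts it as well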

tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 52 warnings
tests/unit/workflow/test_workflow.py: 35 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dask.py:375: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for this write operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/test_io.py: 96 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/__init__.py:38: DeprecationWarning: ColumnGroup is deprecated, use ColumnSelector instead
warnings.warn("ColumnGroup is deprecated, use ColumnSelector instead", DeprecationWarning)

tests/unit/test_io.py: 24 warnings
tests/unit/loader/test_torch_dataloader.py: 54 warnings
tests/unit/workflow/test_workflow_node.py: 1 warning
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/node.py:47: FutureWarning: The ["a", "b", "c"] >> ops.Operator syntax for creating a ColumnGroup has been deprecated in NVTabular 21.09 and will be removed in a future version.
warnings.warn(

tests/unit/test_io.py: 20 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:516: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler is being used for this shuffle operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/ops/test_column_similarity.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/column_similarity.py:109: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[name] = similarities

tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.1]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:125: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[f"{col}_filled"] = df[col].isna()

tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.1]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:126: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[col] = df[col].fillna(self.medians[col])

tests/unit/ops/test_ops.py::test_fill_missing[True-True-parquet]
tests/unit/ops/test_ops.py::test_fill_missing[True-False-parquet]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg/pandas/core/indexing.py:1732: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self._setitem_single_block(indexer, value, name)

tests/unit/ops/test_ops.py::test_fill_missing[True-True-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:54: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[f"{col}_filled"] = df[col].isna()

tests/unit/ops/test_ops.py::test_fill_missing[True-True-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:55: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[col] = df[col].fillna(self.fill_val)

tests/unit/ops/test_ops.py: 80 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/join_external.py:191: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[tmp] = _arange(len(df), like_df=df, dtype="int32")

tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_groupby_op[id-True]
tests/unit/ops/test_ops.py::test_groupby_op[id-False]
/usr/local/lib/python3.8/dist-packages/dask/dataframe/core.py:6778: UserWarning: Insufficient elements for head. 1 elements requested, only 0 elements available. Try passing larger npartitions to head.
warnings.warn(msg.format(n, len(r)))

tests/unit/workflow/test_cpu_workflow.py: 14 warnings
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg/pandas/core/frame.py:3641: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self[k1] = value[k2]
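
Every SettingWithCopyWarning above is the same pandas pattern: writing into a frame that may be a view of another frame. A minimal standalone sketch (hypothetical frame and column names, not the NVTabular code itself) of the chained assignment pandas objects to, and the .loc form its message recommends:

import pandas as pd

df = pd.DataFrame({"a": [1, 2, 3], "b": [4.0, 5.0, 6.0]})
view = df[df["a"] > 1]   # this slice may be a view of df
# view["b"] = 0.0        # chained assignment -> SettingWithCopyWarning

# Assigning through .loc on the original frame avoids the ambiguity:
df.loc[df["a"] > 1, "b"] = 0.0
# ...or take an explicit copy first: view = df[df["a"] > 1].copy(); view["b"] = 0.0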

-- Docs: https://docs.pytest.org/en/stable/warnings.html

---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name Stmts Miss Branch BrPart Cover Missing

examples/multi-gpu-movielens/torch_trainer.py 65 0 6 1 99% 32->36
examples/multi-gpu-movielens/torch_trainer_dist.py 63 0 2 0 100%
nvtabular/init.py 18 0 0 0 100%
nvtabular/columns/init.py 2 0 0 0 100%
nvtabular/columns/schema.py 217 17 107 20 89% 46->62, 49, 51, 53-56, 58, 98->114, 106, 110, 156, 183, 269->276, 272->274, 283, 300->305, 303->305, 316, 340, 347, 356, 359, 364->363
nvtabular/columns/selector.py 74 1 34 0 99% 121
nvtabular/dispatch.py 290 55 144 23 79% 36-40, 45-47, 53-63, 70-71, 114-116, 121-124, 128-133, 140, 159, 170, 176, 181->183, 194, 217-220, 259->261, 268, 271, 277, 293, 300, 331->336, 334, 337, 340->344, 377, 388-391, 417-420, 450, 454, 495, 519, 521, 528
nvtabular/framework_utils/init.py 0 0 0 0 100%
nvtabular/framework_utils/tensorflow/init.py 1 0 0 0 100%
nvtabular/framework_utils/tensorflow/feature_column_utils.py 134 78 90 15 39% 30, 99, 103, 114-130, 140, 143-158, 162, 166-167, 173-198, 207-217, 220-227, 229->233, 234, 239-279, 282
nvtabular/framework_utils/tensorflow/layers/init.py 4 0 0 0 100%
nvtabular/framework_utils/tensorflow/layers/embedding.py 153 12 85 6 91% 60, 68->49, 122, 179, 231-239, 335->343, 357->360, 363-364, 367
nvtabular/framework_utils/tensorflow/layers/interaction.py 47 25 20 1 43% 49, 74-103, 106-110, 113
nvtabular/framework_utils/tensorflow/layers/outer_product.py 30 24 10 0 15% 37-38, 41-60, 71-84, 87
nvtabular/framework_utils/tensorflow/tfrecords_to_parquet.py 58 58 30 0 0% 16-111
nvtabular/framework_utils/torch/init.py 0 0 0 0 100%
nvtabular/framework_utils/torch/layers/init.py 2 0 0 0 100%
nvtabular/framework_utils/torch/layers/embeddings.py 32 2 14 2 91% 50, 91
nvtabular/framework_utils/torch/models.py 45 1 28 4 93% 57->61, 87->89, 93->96, 103
nvtabular/framework_utils/torch/utils.py 75 5 30 5 90% 51->53, 64, 71->76, 75, 118-120
nvtabular/inference/init.py 0 0 0 0 100%
nvtabular/inference/triton/init.py 385 210 180 13 45% 82-86, 141-174, 195-218, 263-307, 338, 364-372, 380-387, 406, 428-444, 485-489, 527-537, 583-623, 629-645, 649-716, 723->726, 726->722, 762-772, 781, 791, 812, 818-844, 850-876, 883, 889->892, 893
nvtabular/inference/triton/benchmarking_tools.py 52 52 10 0 0% 2-103
nvtabular/inference/triton/data_conversions.py 87 3 58 4 95% 32-33, 84
nvtabular/inference/triton/model.py 176 176 98 0 0% 27-332
nvtabular/inference/triton/model_config_pb2.py 299 0 2 0 100%
nvtabular/inference/triton/model_pt.py 101 101 40 0 0% 27-220
nvtabular/io/init.py 4 0 0 0 100%
nvtabular/io/avro.py 88 88 30 0 0% 16-189
nvtabular/io/csv.py 57 6 20 5 86% 22-23, 99, 103->107, 108, 110, 124
nvtabular/io/dask.py 183 8 72 11 93% 111, 114, 150, 401, 411, 428->431, 439, 443->445, 445->441, 450, 452
nvtabular/io/dataframe_engine.py 61 5 28 6 88% 19-20, 50, 69, 88->92, 92->97, 94->97, 97->116, 125
nvtabular/io/dataset.py 361 44 172 29 85% 47-48, 264, 266, 279, 304-318, 441->515, 446-449, 454->464, 471->469, 472->476, 489->493, 504, 515->524, 575-576, 577->581, 629, 752, 754, 756, 762, 766-768, 770, 830-831, 858, 865-866, 872, 878, 975-976, 1094-1099, 1105, 1117-1118, 1206
nvtabular/io/dataset_engine.py 31 1 4 0 97% 48
nvtabular/io/fsspec_utils.py 117 103 64 0 8% 26-27, 42-98, 103-135, 151-191, 213-263, 268-284, 288-290, 304-315
nvtabular/io/hugectr.py 45 2 24 2 91% 34, 74->97, 101
nvtabular/io/parquet.py 569 31 188 28 92% 34-35, 58, 80->103, 98, 123, 152-153, 170->195, 181->195, 232-240, 260, 266, 284->286, 300, 318->328, 321, 370->382, 374, 496-501, 539-544, 660->667, 728->733, 734-735, 855, 859, 863, 869, 901, 918, 922, 929->931, 1039->exit, 1049->1054, 1059->1069, 1074, 1096, 1123
nvtabular/io/shuffle.py 31 7 16 4 72% 42, 44-45, 49, 62-64
nvtabular/io/writer.py 185 13 74 5 92% 25-26, 52, 80, 126, 129, 213, 222, 225, 268, 300-302
nvtabular/io/writer_factory.py 18 2 8 2 85% 35, 60
nvtabular/loader/init.py 0 0 0 0 100%
nvtabular/loader/backend.py 355 13 146 11 95% 129, 144-145, 273->275, 285-289, 335-336, 375->379, 376->375, 450, 454-455, 485, 590, 598
nvtabular/loader/tensorflow.py 163 22 52 7 86% 58, 66-69, 84, 98, 308, 344, 359-361, 390-392, 402-410, 413-416
nvtabular/loader/tf_utils.py 55 10 20 5 80% 29->32, 32->34, 39->41, 43, 50-51, 58-60, 66-70
nvtabular/loader/torch.py 81 13 16 2 78% 25-27, 30-36, 111, 149-150
nvtabular/ops/init.py 22 0 0 0 100%
nvtabular/ops/add_metadata.py 9 0 0 0 100%
nvtabular/ops/bucketize.py 37 10 18 3 69% 53-55, 59->exit, 62-65, 84-87, 94
nvtabular/ops/categorify.py 622 66 332 47 86% 244, 246, 263, 267, 275, 283, 285, 312, 331-332, 359, 370->374, 378-385, 467-468, 493-494, 611, 704, 722, 758, 836-837, 852-856, 857->821, 875, 883, 890->exit, 914, 917->920, 972, 977, 999->1003, 1005->962, 1011-1014, 1026, 1030, 1032, 1039, 1044-1047, 1125, 1127, 1197->1220, 1203->1220, 1221-1226, 1263, 1282->1287, 1286, 1296->1293, 1301->1293, 1308, 1311, 1319-1329
nvtabular/ops/clip.py 18 2 6 3 79% 44, 52->54, 55
nvtabular/ops/column_similarity.py 118 25 38 5 74% 19-20, 78->exit, 108, 134, 198-199, 208-210, 218-234, 251->254, 255, 265
nvtabular/ops/data_stats.py 56 2 22 3 94% 91->93, 95, 97->87, 102
nvtabular/ops/difference_lag.py 31 1 8 1 95% 69->71, 94
nvtabular/ops/dropna.py 8 0 0 0 100%
nvtabular/ops/fill.py 91 12 36 3 82% 63-67, 93, 121, 147, 151, 162-165
nvtabular/ops/filter.py 20 1 6 1 92% 49
nvtabular/ops/groupby.py 119 3 70 4 96% 73, 84, 94->96, 106->111, 141
nvtabular/ops/hash_bucket.py 41 2 20 2 93% 72, 106->112, 118
nvtabular/ops/hashed_cross.py 36 4 15 3 86% 53, 66, 81, 91
nvtabular/ops/internal/init.py 3 0 0 0 100%
nvtabular/ops/internal/concat_columns.py 11 0 0 0 100%
nvtabular/ops/internal/identity.py 6 1 0 0 83% 42
nvtabular/ops/internal/subset_columns.py 13 1 0 0 92% 53
nvtabular/ops/join_external.py 92 18 36 7 76% 20-21, 114, 116, 118, 135-161, 177->179, 216->227, 221
nvtabular/ops/join_groupby.py 101 7 36 4 92% 108, 115, 124, 131->130, 215-216, 219-220
nvtabular/ops/lambdaop.py 39 6 18 6 79% 59, 63, 77, 89, 94, 103
nvtabular/ops/list_slice.py 66 24 26 1 58% 21-22, 53-54, 104-118, 126-137
nvtabular/ops/logop.py 13 0 0 0 100%
nvtabular/ops/moments.py 65 0 20 0 100%
nvtabular/ops/normalize.py 81 10 14 1 86% 70, 78-79, 85, 118-119, 141-142, 146, 157
nvtabular/ops/operator.py 66 1 14 1 98% 111
nvtabular/ops/rename.py 41 3 22 3 90% 47, 88-90
nvtabular/ops/stat_operator.py 8 0 0 0 100%
nvtabular/ops/target_encoding.py 153 11 66 4 91% 167->171, 175->184, 232-233, 236-237, 249-255, 346->349, 362
nvtabular/tags.py 16 0 0 0 100%
nvtabular/tools/init.py 0 0 0 0 100%
nvtabular/tools/data_gen.py 236 1 62 1 99% 321
nvtabular/tools/dataset_inspector.py 50 7 18 1 79% 32-39
nvtabular/tools/inspector_script.py 46 46 0 0 0% 17-168
nvtabular/utils.py 106 43 48 8 54% 31-32, 36-37, 50, 61-62, 64-66, 69, 72, 78, 84, 90-126, 145, 149->153
nvtabular/worker.py 82 5 38 7 90% 24-25, 82->99, 91, 92->99, 99->102, 108, 110, 111->113
nvtabular/workflow/init.py 2 0 0 0 100%
nvtabular/workflow/node.py 240 18 116 10 89% 55, 93->98, 146, 248->252, 288, 302, 311, 329-334, 339, 388-389, 400->395, 453-458
nvtabular/workflow/workflow.py 221 15 112 7 93% 28-29, 47, 139, 195, 222-224, 332, 347-348, 366-367, 503, 515

TOTAL 7799 1533 3139 347 78%
Coverage XML written to file coverage.xml

Required test coverage of 70% reached. Total coverage: 77.63%
=========================== short test summary info ============================
SKIPPED [1] ../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/io/tests/test_s3.py:16: could not import 's3fs': No module named 's3fs'
SKIPPED [8] tests/unit/test_io.py:581: could not import 'uavro': No module named 'uavro'
SKIPPED [1] tests/unit/loader/test_tf_dataloader.py:521: not working correctly in ci environment
==== 1 failed, 1533 passed, 10 skipped, 709 warnings in 1915.33s (0:31:55) =====
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://github.com/gitapi/repos/NVIDIA/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins4801770444979659744.sh

@rjzamora changed the title from "Add epochs= argument to DataLoader and Dataset.to_iter" to "Add DataLoader.epochs() method and Dataset.to_iter" on Sep 28, 2021
@rjzamora changed the title from "Add DataLoader.epochs() method and Dataset.to_iter" to "Add DataLoader.epochs() method and Dataset.to_iter(epochs=) argument" on Sep 28, 2021
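
For reference, the final title names the user-facing API added by this PR; a minimal usage sketch based only on the title and the linked discussion in #1142 (file path and epoch count are hypothetical, and the DataLoader call is illustrative since its signature is not shown in this log):

import nvtabular as nvt

dataset = nvt.Dataset("train.parquet")  # hypothetical input

# Dataset.to_iter() gains an epochs= argument: the iterator yields the
# dataset's partitions the requested number of times.
for gdf in dataset.to_iter(epochs=2):
    pass  # train on / inspect each chunk here

# DataLoader gains an .epochs() method for the same purpose, e.g. (illustrative):
# multi_epoch_loader = loader.epochs(2)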
@nvidia-merlin-bot
Contributor

Click to view CI Results
GitHub pull request #1147 of commit 43649ad8d8cc29db6ee67c0ae2a078a9e7c983f9, no merge conflicts.
Running as SYSTEM
Setting status of 43649ad8d8cc29db6ee67c0ae2a078a9e7c983f9 to PENDING with url http://10.20.13.93:8080/job/nvtabular_tests/3576/ and message: 'Pending'
Using context: Jenkins Unit Test Run
Building in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA/NVTabular.git
 > git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA/NVTabular.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA/NVTabular.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA/NVTabular.git +refs/pull/1147/*:refs/remotes/origin/pr/1147/* # timeout=10
 > git rev-parse 43649ad8d8cc29db6ee67c0ae2a078a9e7c983f9^{commit} # timeout=10
Checking out Revision 43649ad8d8cc29db6ee67c0ae2a078a9e7c983f9 (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 43649ad8d8cc29db6ee67c0ae2a078a9e7c983f9 # timeout=10
Commit message: "improve test"
 > git rev-list --no-walk e6d290b5295463884e2262083df6d726c24b304a # timeout=10
[nvtabular_tests] $ /bin/bash /tmp/jenkins5326536933526992640.sh
Installing NVTabular
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Requirement already satisfied: pip in /var/jenkins_home/.local/lib/python3.8/site-packages (21.2.4)
Requirement already satisfied: setuptools in /var/jenkins_home/.local/lib/python3.8/site-packages (58.1.0)
Requirement already satisfied: wheel in /var/jenkins_home/.local/lib/python3.8/site-packages (0.37.0)
Requirement already satisfied: pybind11 in /var/jenkins_home/.local/lib/python3.8/site-packages (2.7.1)
running develop
running egg_info
creating nvtabular.egg-info
writing nvtabular.egg-info/PKG-INFO
writing dependency_links to nvtabular.egg-info/dependency_links.txt
writing requirements to nvtabular.egg-info/requires.txt
writing top-level names to nvtabular.egg-info/top_level.txt
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching '*.h' under directory 'cpp'
warning: no files found matching '*.cu' under directory 'cpp'
warning: no files found matching '*.cuh' under directory 'cpp'
adding license file 'LICENSE'
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
running build_ext
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.8 -c flagcheck.cpp -o flagcheck.o -std=c++17
building 'nvtabular_cpp' extension
creating build
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/cpp
creating build/temp.linux-x86_64-3.8/cpp/nvtabular
creating build/temp.linux-x86_64-3.8/cpp/nvtabular/inference
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+10.g43649ad -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+10.g43649ad -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+10.g43649ad -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/categorify.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+10.g43649ad -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/fill.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -std=c++17 -fvisibility=hidden -g0
creating build/lib.linux-x86_64-3.8
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -o build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so
copying build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so -> 
Generating nvtabular/inference/triton/model_config_pb2.py from nvtabular/inference/triton/model_config.proto
Creating /var/jenkins_home/.local/lib/python3.8/site-packages/nvtabular.egg-link (link to .)
nvtabular 0.7.0+10.g43649ad is already the active version in easy-install.pth

Installed /var/jenkins_home/workspace/nvtabular_tests/nvtabular
Processing dependencies for nvtabular==0.7.0+10.g43649ad
Searching for protobuf==3.18.0
Best match: protobuf 3.18.0
Adding protobuf 3.18.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for tensorflow-metadata==1.2.0
Best match: tensorflow-metadata 1.2.0
Processing tensorflow_metadata-1.2.0-py3.8.egg
tensorflow-metadata 1.2.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tensorflow_metadata-1.2.0-py3.8.egg
Searching for pyarrow==4.0.1
Best match: pyarrow 4.0.1
Adding pyarrow 4.0.1 to easy-install.pth file
Installing plasma_store script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for tqdm==4.61.2
Best match: tqdm 4.61.2
Processing tqdm-4.61.2-py3.8.egg
tqdm 4.61.2 is already the active version in easy-install.pth
Installing tqdm script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tqdm-4.61.2-py3.8.egg
Searching for numba==0.54.0
Best match: numba 0.54.0
Processing numba-0.54.0-py3.8-linux-x86_64.egg
numba 0.54.0 is already the active version in easy-install.pth
Installing pycc script to /var/jenkins_home/.local/bin
Installing numba script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/numba-0.54.0-py3.8-linux-x86_64.egg
Searching for pandas==1.3.3
Best match: pandas 1.3.3
Processing pandas-1.3.3-py3.8-linux-x86_64.egg
pandas 1.3.3 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg
Searching for distributed==2021.7.1
Best match: distributed 2021.7.1
Processing distributed-2021.7.1-py3.8.egg
distributed 2021.7.1 is already the active version in easy-install.pth
Installing dask-ssh script to /var/jenkins_home/.local/bin
Installing dask-scheduler script to /var/jenkins_home/.local/bin
Installing dask-worker script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/distributed-2021.7.1-py3.8.egg
Searching for dask==2021.7.1
Best match: dask 2021.7.1
Adding dask 2021.7.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for googleapis-common-protos==1.53.0
Best match: googleapis-common-protos 1.53.0
Processing googleapis_common_protos-1.53.0-py3.8.egg
googleapis-common-protos 1.53.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/googleapis_common_protos-1.53.0-py3.8.egg
Searching for absl-py==0.12.0
Best match: absl-py 0.12.0
Processing absl_py-0.12.0-py3.8.egg
absl-py 0.12.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/absl_py-0.12.0-py3.8.egg
Searching for numpy==1.20.2
Best match: numpy 1.20.2
Adding numpy 1.20.2 to easy-install.pth file
Installing f2py script to /var/jenkins_home/.local/bin
Installing f2py3 script to /var/jenkins_home/.local/bin
Installing f2py3.8 script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for setuptools==58.1.0
Best match: setuptools 58.1.0
Adding setuptools 58.1.0 to easy-install.pth file

Using /var/jenkins_home/.local/lib/python3.8/site-packages
Searching for llvmlite==0.37.0
Best match: llvmlite 0.37.0
Processing llvmlite-0.37.0-py3.8-linux-x86_64.egg
llvmlite 0.37.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/llvmlite-0.37.0-py3.8-linux-x86_64.egg
Searching for pytz==2021.1
Best match: pytz 2021.1
Adding pytz 2021.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for python-dateutil==2.8.2
Best match: python-dateutil 2.8.2
Adding python-dateutil 2.8.2 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for zict==2.0.0
Best match: zict 2.0.0
Processing zict-2.0.0-py3.8.egg
zict 2.0.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg
Searching for tornado==6.1
Best match: tornado 6.1
Processing tornado-6.1-py3.8-linux-x86_64.egg
tornado 6.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg
Searching for toolz==0.11.1
Best match: toolz 0.11.1
Processing toolz-0.11.1-py3.8.egg
toolz 0.11.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/toolz-0.11.1-py3.8.egg
Searching for tblib==1.7.0
Best match: tblib 1.7.0
Processing tblib-1.7.0-py3.8.egg
tblib 1.7.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg
Searching for sortedcontainers==2.4.0
Best match: sortedcontainers 2.4.0
Processing sortedcontainers-2.4.0-py3.8.egg
sortedcontainers 2.4.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg
Searching for PyYAML==5.4.1
Best match: PyYAML 5.4.1
Processing PyYAML-5.4.1-py3.8-linux-x86_64.egg
PyYAML 5.4.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg
Searching for psutil==5.8.0
Best match: psutil 5.8.0
Processing psutil-5.8.0-py3.8-linux-x86_64.egg
psutil 5.8.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg
Searching for msgpack==1.0.2
Best match: msgpack 1.0.2
Processing msgpack-1.0.2-py3.8-linux-x86_64.egg
msgpack 1.0.2 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/msgpack-1.0.2-py3.8-linux-x86_64.egg
Searching for cloudpickle==1.6.0
Best match: cloudpickle 1.6.0
Processing cloudpickle-1.6.0-py3.8.egg
cloudpickle 1.6.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/cloudpickle-1.6.0-py3.8.egg
Searching for click==8.0.1
Best match: click 8.0.1
Processing click-8.0.1-py3.8.egg
click 8.0.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/click-8.0.1-py3.8.egg
Searching for partd==1.2.0
Best match: partd 1.2.0
Processing partd-1.2.0-py3.8.egg
partd 1.2.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg
Searching for fsspec==2021.9.0
Best match: fsspec 2021.9.0
Adding fsspec 2021.9.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for packaging==21.0
Best match: packaging 21.0
Adding packaging 21.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for six==1.15.0
Best match: six 1.15.0
Adding six 1.15.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for HeapDict==1.0.1
Best match: HeapDict 1.0.1
Processing HeapDict-1.0.1-py3.8.egg
HeapDict 1.0.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg
Searching for locket==0.2.1
Best match: locket 0.2.1
Processing locket-0.2.1-py3.8.egg
locket 0.2.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg
Searching for pyparsing==2.4.7
Best match: pyparsing 2.4.7
Adding pyparsing 2.4.7 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Finished processing dependencies for nvtabular==0.7.0+10.g43649ad
Running black --check
All done! ✨ 🍰 ✨
130 files would be left unchanged.
Running flake8
Running isort
Skipped 2 files
Running bandit
Running pylint
************* Module nvtabular.ops.categorify
nvtabular/ops/categorify.py:497:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)
************* Module nvtabular.ops.fill
nvtabular/ops/fill.py:67:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)


Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)

Running flake8-nb
Building docs
make: Entering directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.7) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
make: Leaving directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-6.2.5, py-1.10.0, pluggy-1.0.0
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: cov-2.12.1, forked-1.3.0, xdist-2.4.0
collected 1543 items / 1 skipped / 1542 selected

tests/unit/test_dask_nvt.py ............................................ [ 2%]
....................................................................... [ 7%]
tests/unit/test_io.py .................................................. [ 10%]
........................................................................ [ 15%]
.................ssssssss............................................... [ 20%]
........ [ 20%]
tests/unit/test_notebooks.py ...... [ 20%]
tests/unit/test_tf4rec.py . [ 20%]
tests/unit/test_tools.py ...................... [ 22%]
tests/unit/test_triton_inference.py .............................. [ 24%]
tests/unit/columns/test_column_schemas.py .............................. [ 26%]
.................................................... [ 29%]
tests/unit/columns/test_column_selector.py .................... [ 30%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 31%]
tests/unit/framework_utils/test_tf_layers.py ........................... [ 32%]
................................................... [ 36%]
tests/unit/framework_utils/test_torch_layers.py . [ 36%]
tests/unit/loader/test_dataloader_backend.py .... [ 36%]
tests/unit/loader/test_tf_dataloader.py ................................ [ 38%]
........................................s.. [ 41%]
tests/unit/loader/test_torch_dataloader.py ............................. [ 43%]
........................................................ [ 46%]
tests/unit/ops/test_column_similarity.py ........................ [ 48%]
tests/unit/ops/test_ops.py ............................................. [ 51%]
........................................................................ [ 55%]
........................................................................ [ 60%]
........................................................................ [ 65%]
........................................................................ [ 69%]
........................................................................ [ 74%]
................................................. [ 77%]
tests/unit/ops/test_ops_schema.py ...................................... [ 80%]
........................................................................ [ 84%]
........................................................................ [ 89%]
.......................... [ 91%]
tests/unit/workflow/test_cpu_workflow.py ...... [ 91%]
tests/unit/workflow/test_workflow.py ................................... [ 93%]
.......................................................... [ 97%]
tests/unit/workflow/test_workflow_node.py ........... [ 98%]
tests/unit/workflow/test_workflow_ops.py .. [ 98%]
tests/unit/workflow/test_workflow_schemas.py ....................... [100%]

=============================== warnings summary ===============================
tests/unit/test_dask_nvt.py: 3 warnings
tests/unit/test_io.py: 24 warnings
tests/unit/test_tf4rec.py: 1 warning
tests/unit/test_tools.py: 2 warnings
tests/unit/test_triton_inference.py: 5 warnings
tests/unit/loader/test_tf_dataloader.py: 50 warnings
tests/unit/loader/test_torch_dataloader.py: 16 warnings
tests/unit/ops/test_column_similarity.py: 7 warnings
tests/unit/ops/test_ops.py: 74 warnings
tests/unit/workflow/test_workflow.py: 30 warnings
tests/unit/workflow/test_workflow_node.py: 1 warning
tests/unit/workflow/test_workflow_schemas.py: 1 warning
/var/jenkins_home/.local/lib/python3.8/site-packages/numba-0.54.0-py3.8-linux-x86_64.egg/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (1) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))

tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 36 warnings
tests/unit/workflow/test_workflow.py: 44 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/workflow.py:89: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for execution. Please use the client argument to initialize a Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 52 warnings
tests/unit/workflow/test_workflow.py: 35 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dask.py:375: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for this write operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/test_io.py: 96 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/__init__.py:38: DeprecationWarning: ColumnGroup is deprecated, use ColumnSelector instead
warnings.warn("ColumnGroup is deprecated, use ColumnSelector instead", DeprecationWarning)

tests/unit/test_io.py: 24 warnings
tests/unit/loader/test_torch_dataloader.py: 54 warnings
tests/unit/workflow/test_workflow_node.py: 1 warning
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/node.py:47: FutureWarning: The ["a", "b", "c"] >> ops.Operator syntax for creating a ColumnGroup has been deprecated in NVTabular 21.09 and will be removed in a future version.
warnings.warn(

tests/unit/test_io.py: 20 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:516: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler is being used for this shuffle operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.1]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:125: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[f"{col}_filled"] = df[col].isna()

tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.1]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:126: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[col] = df[col].fillna(self.medians[col])

tests/unit/ops/test_ops.py::test_fill_missing[True-True-parquet]
tests/unit/ops/test_ops.py::test_fill_missing[True-False-parquet]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg/pandas/core/indexing.py:1732: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self._setitem_single_block(indexer, value, name)

tests/unit/ops/test_ops.py::test_fill_missing[True-True-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:54: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[f"{col}_filled"] = df[col].isna()

tests/unit/ops/test_ops.py::test_fill_missing[True-True-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:55: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[col] = df[col].fillna(self.fill_val)

tests/unit/ops/test_ops.py: 80 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/join_external.py:191: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[tmp] = _arange(len(df), like_df=df, dtype="int32")

tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_groupby_op[id-True]
tests/unit/ops/test_ops.py::test_groupby_op[id-False]
/usr/local/lib/python3.8/dist-packages/dask/dataframe/core.py:6778: UserWarning: Insufficient elements for head. 1 elements requested, only 0 elements available. Try passing larger npartitions to head.
warnings.warn(msg.format(n, len(r)))

tests/unit/workflow/test_cpu_workflow.py: 14 warnings
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg/pandas/core/frame.py:3641: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self[k1] = value[k2]

-- Docs: https://docs.pytest.org/en/stable/warnings.html

---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name Stmts Miss Branch BrPart Cover Missing

examples/multi-gpu-movielens/torch_trainer.py 65 0 6 1 99% 32->36
examples/multi-gpu-movielens/torch_trainer_dist.py 63 0 2 0 100%
nvtabular/init.py 18 0 0 0 100%
nvtabular/columns/init.py 2 0 0 0 100%
nvtabular/columns/schema.py 217 17 107 20 89% 46->62, 49, 51, 53-56, 58, 98->114, 106, 110, 156, 183, 269->276, 272->274, 283, 300->305, 303->305, 316, 340, 347, 356, 359, 364->363
nvtabular/columns/selector.py 74 1 34 0 99% 121
nvtabular/dispatch.py 290 55 144 23 79% 36-40, 45-47, 53-63, 70-71, 114-116, 121-124, 128-133, 140, 159, 170, 176, 181->183, 194, 217-220, 259->261, 268, 271, 277, 293, 300, 331->336, 334, 337, 340->344, 377, 388-391, 417-420, 450, 454, 495, 519, 521, 528
nvtabular/framework_utils/init.py 0 0 0 0 100%
nvtabular/framework_utils/tensorflow/init.py 1 0 0 0 100%
nvtabular/framework_utils/tensorflow/feature_column_utils.py 134 78 90 15 39% 30, 99, 103, 114-130, 140, 143-158, 162, 166-167, 173-198, 207-217, 220-227, 229->233, 234, 239-279, 282
nvtabular/framework_utils/tensorflow/layers/init.py 4 0 0 0 100%
nvtabular/framework_utils/tensorflow/layers/embedding.py 153 12 85 6 91% 60, 68->49, 122, 179, 231-239, 335->343, 357->360, 363-364, 367
nvtabular/framework_utils/tensorflow/layers/interaction.py 47 25 20 1 43% 49, 74-103, 106-110, 113
nvtabular/framework_utils/tensorflow/layers/outer_product.py 30 24 10 0 15% 37-38, 41-60, 71-84, 87
nvtabular/framework_utils/tensorflow/tfrecords_to_parquet.py 58 58 30 0 0% 16-111
nvtabular/framework_utils/torch/init.py 0 0 0 0 100%
nvtabular/framework_utils/torch/layers/init.py 2 0 0 0 100%
nvtabular/framework_utils/torch/layers/embeddings.py 32 2 14 2 91% 50, 91
nvtabular/framework_utils/torch/models.py 45 1 28 4 93% 57->61, 87->89, 93->96, 103
nvtabular/framework_utils/torch/utils.py 75 5 30 5 90% 51->53, 64, 71->76, 75, 118-120
nvtabular/inference/init.py 0 0 0 0 100%
nvtabular/inference/triton/init.py 385 210 180 13 45% 82-86, 141-174, 195-218, 263-307, 338, 364-372, 380-387, 406, 428-444, 485-489, 527-537, 583-623, 629-645, 649-716, 723->726, 726->722, 762-772, 781, 791, 812, 818-844, 850-876, 883, 889->892, 893
nvtabular/inference/triton/benchmarking_tools.py 52 52 10 0 0% 2-103
nvtabular/inference/triton/data_conversions.py 87 3 58 4 95% 32-33, 84
nvtabular/inference/triton/model.py 176 176 98 0 0% 27-332
nvtabular/inference/triton/model_config_pb2.py 299 0 2 0 100%
nvtabular/inference/triton/model_pt.py 101 101 40 0 0% 27-220
nvtabular/io/init.py 4 0 0 0 100%
nvtabular/io/avro.py 88 88 30 0 0% 16-189
nvtabular/io/csv.py 57 6 20 5 86% 22-23, 99, 103->107, 108, 110, 124
nvtabular/io/dask.py 183 8 72 11 93% 111, 114, 150, 401, 411, 428->431, 439, 443->445, 445->441, 450, 452
nvtabular/io/dataframe_engine.py 61 5 28 6 88% 19-20, 50, 69, 88->92, 92->97, 94->97, 97->116, 125
nvtabular/io/dataset.py 361 44 172 29 85% 47-48, 264, 266, 279, 304-318, 441->515, 446-449, 454->464, 471->469, 472->476, 489->493, 504, 515->524, 575-576, 577->581, 629, 752, 754, 756, 762, 766-768, 770, 830-831, 858, 865-866, 872, 878, 975-976, 1094-1099, 1105, 1117-1118, 1206
nvtabular/io/dataset_engine.py 31 1 4 0 97% 48
nvtabular/io/fsspec_utils.py 117 103 64 0 8% 26-27, 42-98, 103-135, 151-191, 213-263, 268-284, 288-290, 304-315
nvtabular/io/hugectr.py 45 2 24 2 91% 34, 74->97, 101
nvtabular/io/parquet.py 569 31 188 28 92% 34-35, 58, 80->103, 98, 123, 152-153, 170->195, 181->195, 232-240, 260, 266, 284->286, 300, 318->328, 321, 370->382, 374, 496-501, 539-544, 660->667, 728->733, 734-735, 855, 859, 863, 869, 901, 918, 922, 929->931, 1039->exit, 1049->1054, 1059->1069, 1074, 1096, 1123
nvtabular/io/shuffle.py 31 7 16 4 72% 42, 44-45, 49, 62-64
nvtabular/io/writer.py 185 13 74 5 92% 25-26, 52, 80, 126, 129, 213, 222, 225, 268, 300-302
nvtabular/io/writer_factory.py 18 2 8 2 85% 35, 60
nvtabular/loader/init.py 0 0 0 0 100%
nvtabular/loader/backend.py 355 13 146 11 95% 129, 144-145, 273->275, 285-289, 335-336, 375->379, 376->375, 450, 454-455, 485, 590, 598
nvtabular/loader/tensorflow.py 163 22 52 7 86% 58, 66-69, 84, 98, 308, 344, 359-361, 390-392, 402-410, 413-416
nvtabular/loader/tf_utils.py 55 10 20 5 80% 29->32, 32->34, 39->41, 43, 50-51, 58-60, 66-70
nvtabular/loader/torch.py 81 13 16 2 78% 25-27, 30-36, 111, 149-150
nvtabular/ops/init.py 22 0 0 0 100%
nvtabular/ops/add_metadata.py 9 0 0 0 100%
nvtabular/ops/bucketize.py 37 10 18 3 69% 53-55, 59->exit, 62-65, 84-87, 94
nvtabular/ops/categorify.py 622 66 332 47 86% 244, 246, 263, 267, 275, 283, 285, 312, 331-332, 359, 370->374, 378-385, 467-468, 493-494, 611, 704, 722, 758, 836-837, 852-856, 857->821, 875, 883, 890->exit, 914, 917->920, 972, 977, 999->1003, 1005->962, 1011-1014, 1026, 1030, 1032, 1039, 1044-1047, 1125, 1127, 1197->1220, 1203->1220, 1221-1226, 1263, 1282->1287, 1286, 1296->1293, 1301->1293, 1308, 1311, 1319-1329
nvtabular/ops/clip.py 18 2 6 3 79% 44, 52->54, 55
nvtabular/ops/column_similarity.py 118 25 38 5 74% 19-20, 78->exit, 108, 134, 198-199, 208-210, 218-234, 251->254, 255, 265
nvtabular/ops/data_stats.py 56 2 22 3 94% 91->93, 95, 97->87, 102
nvtabular/ops/difference_lag.py 31 1 8 1 95% 69->71, 94
nvtabular/ops/dropna.py 8 0 0 0 100%
nvtabular/ops/fill.py 91 12 36 3 82% 63-67, 93, 121, 147, 151, 162-165
nvtabular/ops/filter.py 20 1 6 1 92% 49
nvtabular/ops/groupby.py 119 3 70 4 96% 73, 84, 94->96, 106->111, 141
nvtabular/ops/hash_bucket.py 41 2 20 2 93% 72, 106->112, 118
nvtabular/ops/hashed_cross.py 36 4 15 3 86% 53, 66, 81, 91
nvtabular/ops/internal/init.py 3 0 0 0 100%
nvtabular/ops/internal/concat_columns.py 11 0 0 0 100%
nvtabular/ops/internal/identity.py 6 1 0 0 83% 42
nvtabular/ops/internal/subset_columns.py 13 1 0 0 92% 53
nvtabular/ops/join_external.py 92 18 36 7 76% 20-21, 114, 116, 118, 135-161, 177->179, 216->227, 221
nvtabular/ops/join_groupby.py 101 7 36 4 92% 108, 115, 124, 131->130, 215-216, 219-220
nvtabular/ops/lambdaop.py 39 6 18 6 79% 59, 63, 77, 89, 94, 103
nvtabular/ops/list_slice.py 66 24 26 1 58% 21-22, 53-54, 104-118, 126-137
nvtabular/ops/logop.py 13 0 0 0 100%
nvtabular/ops/moments.py 65 0 20 0 100%
nvtabular/ops/normalize.py 81 10 14 1 86% 70, 78-79, 85, 118-119, 141-142, 146, 157
nvtabular/ops/operator.py 66 1 14 1 98% 111
nvtabular/ops/rename.py 41 3 22 3 90% 47, 88-90
nvtabular/ops/stat_operator.py 8 0 0 0 100%
nvtabular/ops/target_encoding.py 153 11 66 4 91% 167->171, 175->184, 232-233, 236-237, 249-255, 346->349, 362
nvtabular/tags.py 16 0 0 0 100%
nvtabular/tools/init.py 0 0 0 0 100%
nvtabular/tools/data_gen.py 236 1 62 1 99% 321
nvtabular/tools/dataset_inspector.py 50 7 18 1 79% 32-39
nvtabular/tools/inspector_script.py 46 46 0 0 0% 17-168
nvtabular/utils.py 106 43 48 8 54% 31-32, 36-37, 50, 61-62, 64-66, 69, 72, 78, 84, 90-126, 145, 149->153
nvtabular/worker.py 82 5 38 7 90% 24-25, 82->99, 91, 92->99, 99->102, 108, 110, 111->113
nvtabular/workflow/init.py 2 0 0 0 100%
nvtabular/workflow/node.py 240 18 116 10 89% 55, 93->98, 146, 248->252, 288, 302, 311, 329-334, 339, 388-389, 400->395, 453-458
nvtabular/workflow/workflow.py 221 15 112 7 93% 28-29, 47, 139, 195, 222-224, 332, 347-348, 366-367, 503, 515

TOTAL 7799 1533 3139 347 78%
Coverage XML written to file coverage.xml

Required test coverage of 70% reached. Total coverage: 77.63%
=========================== short test summary info ============================
SKIPPED [1] ../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/io/tests/test_s3.py:16: could not import 's3fs': No module named 's3fs'
SKIPPED [8] tests/unit/test_io.py:581: could not import 'uavro': No module named 'uavro'
SKIPPED [1] tests/unit/loader/test_tf_dataloader.py:521: not working correctly in ci environment
========= 1534 passed, 10 skipped, 697 warnings in 1932.30s (0:32:12) ==========
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://github.com/gitapi/repos/NVIDIA/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins6322481766488039653.sh

@nvidia-merlin-bot
Contributor

Click to view CI Results
GitHub pull request #1147 of commit d5dd74e851345cc932d9a350efcfd745c3db9e31, no merge conflicts.
Running as SYSTEM
Setting status of d5dd74e851345cc932d9a350efcfd745c3db9e31 to PENDING with url http://10.20.13.93:8080/job/nvtabular_tests/3583/ and message: 'Pending'
Using context: Jenkins Unit Test Run
Building in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA/NVTabular.git
 > git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA/NVTabular.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA/NVTabular.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA/NVTabular.git +refs/pull/1147/*:refs/remotes/origin/pr/1147/* # timeout=10
 > git rev-parse d5dd74e851345cc932d9a350efcfd745c3db9e31^{commit} # timeout=10
Checking out Revision d5dd74e851345cc932d9a350efcfd745c3db9e31 (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f d5dd74e851345cc932d9a350efcfd745c3db9e31 # timeout=10
Commit message: "Merge branch 'main' into epochs-arg"
 > git rev-list --no-walk dd915c9a242e2e774826ace68ccbdd64883e6d85 # timeout=10
First time build. Skipping changelog.
[nvtabular_tests] $ /bin/bash /tmp/jenkins650892593572642154.sh
Installing NVTabular
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Requirement already satisfied: pip in /var/jenkins_home/.local/lib/python3.8/site-packages (21.2.4)
Requirement already satisfied: setuptools in /var/jenkins_home/.local/lib/python3.8/site-packages (58.2.0)
Requirement already satisfied: wheel in /var/jenkins_home/.local/lib/python3.8/site-packages (0.37.0)
Requirement already satisfied: pybind11 in /var/jenkins_home/.local/lib/python3.8/site-packages (2.7.1)
running develop
running egg_info
creating nvtabular.egg-info
writing nvtabular.egg-info/PKG-INFO
writing dependency_links to nvtabular.egg-info/dependency_links.txt
writing requirements to nvtabular.egg-info/requires.txt
writing top-level names to nvtabular.egg-info/top_level.txt
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching '*.h' under directory 'cpp'
warning: no files found matching '*.cu' under directory 'cpp'
warning: no files found matching '*.cuh' under directory 'cpp'
adding license file 'LICENSE'
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
running build_ext
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.8 -c flagcheck.cpp -o flagcheck.o -std=c++17
building 'nvtabular_cpp' extension
creating build
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/cpp
creating build/temp.linux-x86_64-3.8/cpp/nvtabular
creating build/temp.linux-x86_64-3.8/cpp/nvtabular/inference
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+13.gd5dd74e -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+13.gd5dd74e -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+13.gd5dd74e -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/categorify.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+13.gd5dd74e -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/fill.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -std=c++17 -fvisibility=hidden -g0
creating build/lib.linux-x86_64-3.8
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -o build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so
copying build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so -> 
Generating nvtabular/inference/triton/model_config_pb2.py from nvtabular/inference/triton/model_config.proto
Creating /var/jenkins_home/.local/lib/python3.8/site-packages/nvtabular.egg-link (link to .)
nvtabular 0.7.0+13.gd5dd74e is already the active version in easy-install.pth

Installed /var/jenkins_home/workspace/nvtabular_tests/nvtabular
Processing dependencies for nvtabular==0.7.0+13.gd5dd74e
Searching for protobuf==3.18.0
Best match: protobuf 3.18.0
Adding protobuf 3.18.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for tensorflow-metadata==1.2.0
Best match: tensorflow-metadata 1.2.0
Processing tensorflow_metadata-1.2.0-py3.8.egg
tensorflow-metadata 1.2.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tensorflow_metadata-1.2.0-py3.8.egg
Searching for pyarrow==4.0.1
Best match: pyarrow 4.0.1
Adding pyarrow 4.0.1 to easy-install.pth file
Installing plasma_store script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for tqdm==4.61.2
Best match: tqdm 4.61.2
Processing tqdm-4.61.2-py3.8.egg
tqdm 4.61.2 is already the active version in easy-install.pth
Installing tqdm script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tqdm-4.61.2-py3.8.egg
Searching for numba==0.54.0
Best match: numba 0.54.0
Processing numba-0.54.0-py3.8-linux-x86_64.egg
numba 0.54.0 is already the active version in easy-install.pth
Installing pycc script to /var/jenkins_home/.local/bin
Installing numba script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/numba-0.54.0-py3.8-linux-x86_64.egg
Searching for pandas==1.3.3
Best match: pandas 1.3.3
Processing pandas-1.3.3-py3.8-linux-x86_64.egg
pandas 1.3.3 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg
Searching for distributed==2021.7.1
Best match: distributed 2021.7.1
Processing distributed-2021.7.1-py3.8.egg
distributed 2021.7.1 is already the active version in easy-install.pth
Installing dask-ssh script to /var/jenkins_home/.local/bin
Installing dask-scheduler script to /var/jenkins_home/.local/bin
Installing dask-worker script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/distributed-2021.7.1-py3.8.egg
Searching for dask==2021.7.1
Best match: dask 2021.7.1
Adding dask 2021.7.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for googleapis-common-protos==1.53.0
Best match: googleapis-common-protos 1.53.0
Processing googleapis_common_protos-1.53.0-py3.8.egg
googleapis-common-protos 1.53.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/googleapis_common_protos-1.53.0-py3.8.egg
Searching for absl-py==0.12.0
Best match: absl-py 0.12.0
Processing absl_py-0.12.0-py3.8.egg
absl-py 0.12.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/absl_py-0.12.0-py3.8.egg
Searching for numpy==1.20.2
Best match: numpy 1.20.2
Adding numpy 1.20.2 to easy-install.pth file
Installing f2py script to /var/jenkins_home/.local/bin
Installing f2py3 script to /var/jenkins_home/.local/bin
Installing f2py3.8 script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for setuptools==58.2.0
Best match: setuptools 58.2.0
Adding setuptools 58.2.0 to easy-install.pth file

Using /var/jenkins_home/.local/lib/python3.8/site-packages
Searching for llvmlite==0.37.0
Best match: llvmlite 0.37.0
Processing llvmlite-0.37.0-py3.8-linux-x86_64.egg
llvmlite 0.37.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/llvmlite-0.37.0-py3.8-linux-x86_64.egg
Searching for pytz==2021.1
Best match: pytz 2021.1
Adding pytz 2021.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for python-dateutil==2.8.2
Best match: python-dateutil 2.8.2
Adding python-dateutil 2.8.2 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for zict==2.0.0
Best match: zict 2.0.0
Processing zict-2.0.0-py3.8.egg
zict 2.0.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg
Searching for tornado==6.1
Best match: tornado 6.1
Processing tornado-6.1-py3.8-linux-x86_64.egg
tornado 6.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg
Searching for toolz==0.11.1
Best match: toolz 0.11.1
Processing toolz-0.11.1-py3.8.egg
toolz 0.11.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/toolz-0.11.1-py3.8.egg
Searching for tblib==1.7.0
Best match: tblib 1.7.0
Processing tblib-1.7.0-py3.8.egg
tblib 1.7.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg
Searching for sortedcontainers==2.4.0
Best match: sortedcontainers 2.4.0
Processing sortedcontainers-2.4.0-py3.8.egg
sortedcontainers 2.4.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg
Searching for PyYAML==5.4.1
Best match: PyYAML 5.4.1
Processing PyYAML-5.4.1-py3.8-linux-x86_64.egg
PyYAML 5.4.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg
Searching for psutil==5.8.0
Best match: psutil 5.8.0
Processing psutil-5.8.0-py3.8-linux-x86_64.egg
psutil 5.8.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg
Searching for msgpack==1.0.2
Best match: msgpack 1.0.2
Processing msgpack-1.0.2-py3.8-linux-x86_64.egg
msgpack 1.0.2 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/msgpack-1.0.2-py3.8-linux-x86_64.egg
Searching for cloudpickle==1.6.0
Best match: cloudpickle 1.6.0
Processing cloudpickle-1.6.0-py3.8.egg
cloudpickle 1.6.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/cloudpickle-1.6.0-py3.8.egg
Searching for click==8.0.1
Best match: click 8.0.1
Processing click-8.0.1-py3.8.egg
click 8.0.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/click-8.0.1-py3.8.egg
Searching for fsspec==2021.9.0
Best match: fsspec 2021.9.0
Adding fsspec 2021.9.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for packaging==21.0
Best match: packaging 21.0
Adding packaging 21.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for partd==1.2.0
Best match: partd 1.2.0
Processing partd-1.2.0-py3.8.egg
partd 1.2.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg
Searching for six==1.15.0
Best match: six 1.15.0
Adding six 1.15.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for HeapDict==1.0.1
Best match: HeapDict 1.0.1
Processing HeapDict-1.0.1-py3.8.egg
HeapDict 1.0.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg
Searching for pyparsing==2.4.7
Best match: pyparsing 2.4.7
Adding pyparsing 2.4.7 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for locket==0.2.1
Best match: locket 0.2.1
Processing locket-0.2.1-py3.8.egg
locket 0.2.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg
Finished processing dependencies for nvtabular==0.7.0+13.gd5dd74e
Running black --check
All done! ✨ 🍰 ✨
130 files would be left unchanged.
Running flake8
Running isort
Skipped 2 files
Running bandit
Running pylint
************* Module nvtabular.ops.categorify
nvtabular/ops/categorify.py:497:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)
************* Module nvtabular.ops.fill
nvtabular/ops/fill.py:67:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)


Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)
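
The two I1101 notes are informational, but they can be silenced the way the message itself suggests: allow-listing the compiled nvtabular_cpp extension so pylint may introspect it. A minimal sketch, assuming the project's pylint settings live in a .pylintrc-style [MASTER] section (the actual config file used by this CI job is not shown in the log):

    [MASTER]
    # Let pylint introspect the compiled extension instead of emitting
    # I1101 (c-extension-no-member) for nvtabular_cpp members.
    extension-pkg-allow-list=nvtabular_cpp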

Running flake8-nb
Building docs
make: Entering directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.7) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
make: Leaving directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-6.2.5, py-1.10.0, pluggy-1.0.0
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: cov-2.12.1, forked-1.3.0, xdist-2.4.0
collected 1543 items / 1 skipped / 1542 selected

tests/unit/test_dask_nvt.py ............................................ [ 2%]
....................................................................... [ 7%]
tests/unit/test_io.py .................................................. [ 10%]
........................................................................ [ 15%]
.................ssssssss............................................... [ 20%]
........ [ 20%]
tests/unit/test_notebooks.py ...... [ 20%]
tests/unit/test_tf4rec.py . [ 20%]
tests/unit/test_tools.py ...................... [ 22%]
tests/unit/test_triton_inference.py .............................. [ 24%]
tests/unit/columns/test_column_schemas.py .............................. [ 26%]
.................................................... [ 29%]
tests/unit/columns/test_column_selector.py .................... [ 30%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 31%]
tests/unit/framework_utils/test_tf_layers.py ........................... [ 32%]
................................................... [ 36%]
tests/unit/framework_utils/test_torch_layers.py . [ 36%]
tests/unit/loader/test_dataloader_backend.py .... [ 36%]
tests/unit/loader/test_tf_dataloader.py ................................ [ 38%]
........................................s.. [ 41%]
tests/unit/loader/test_torch_dataloader.py ............................. [ 43%]
........................................................ [ 46%]
tests/unit/ops/test_column_similarity.py ........................ [ 48%]
tests/unit/ops/test_ops.py ............................................. [ 51%]
........................................................................ [ 55%]
........................................................................ [ 60%]
........................................................................ [ 65%]
........................................................................ [ 69%]
........................................................................ [ 74%]
................................................. [ 77%]
tests/unit/ops/test_ops_schema.py ...................................... [ 80%]
........................................................................ [ 84%]
........................................................................ [ 89%]
.......................... [ 91%]
tests/unit/workflow/test_cpu_workflow.py ...... [ 91%]
tests/unit/workflow/test_workflow.py ................................... [ 93%]
.......................................................... [ 97%]
tests/unit/workflow/test_workflow_node.py ........... [ 98%]
tests/unit/workflow/test_workflow_ops.py .. [ 98%]
tests/unit/workflow/test_workflow_schemas.py ....................... [100%]

=============================== warnings summary ===============================
tests/unit/test_dask_nvt.py: 3 warnings
tests/unit/test_io.py: 24 warnings
tests/unit/test_tf4rec.py: 1 warning
tests/unit/test_tools.py: 2 warnings
tests/unit/test_triton_inference.py: 5 warnings
tests/unit/loader/test_tf_dataloader.py: 50 warnings
tests/unit/loader/test_torch_dataloader.py: 16 warnings
tests/unit/ops/test_column_similarity.py: 7 warnings
tests/unit/ops/test_ops.py: 74 warnings
tests/unit/workflow/test_workflow.py: 30 warnings
tests/unit/workflow/test_workflow_node.py: 1 warning
tests/unit/workflow/test_workflow_schemas.py: 1 warning
/var/jenkins_home/.local/lib/python3.8/site-packages/numba-0.54.0-py3.8-linux-x86_64.egg/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (1) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
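
This NumbaPerformanceWarning is numba's occupancy heuristic: the kernel was launched with a single block while the device reports 112 SMs, and numba wants at least 2 * 112 = 224 blocks before it stops warning. That is expected for the tiny inputs used in these unit tests. For reference, a hedged sketch (illustrative kernel and sizes, not code from this repository) of sizing the grid from the data so the warning would not fire:

    from numba import cuda
    import numpy as np

    @cuda.jit
    def scale(x, a):
        i = cuda.grid(1)              # global thread index
        if i < x.size:
            x[i] *= a

    x = cuda.to_device(np.ones(1 << 20, dtype=np.float32))
    threads = 256
    blocks = (x.size + threads - 1) // threads   # ~4096 blocks, well above 2 * 112
    scale[blocks, threads](x, np.float32(2.0))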

tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 36 warnings
tests/unit/workflow/test_workflow.py: 44 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/workflow.py:89: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for execution. Please use the client argument to initialize a Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 52 warnings
tests/unit/workflow/test_workflow.py: 35 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dask.py:375: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for this write operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/test_io.py: 96 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/__init__.py:38: DeprecationWarning: ColumnGroup is deprecated, use ColumnSelector instead
warnings.warn("ColumnGroup is deprecated, use ColumnSelector instead", DeprecationWarning)

tests/unit/test_io.py: 24 warnings
tests/unit/loader/test_torch_dataloader.py: 54 warnings
tests/unit/workflow/test_workflow_node.py: 1 warning
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/node.py:47: FutureWarning: The ["a", "b", "c"] >> ops.Operator syntax for creating a ColumnGroup has been deprecated in NVTabular 21.09 and will be removed in a future version.
warnings.warn(

tests/unit/test_io.py: 20 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:516: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler is being used for this shuffle operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(
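
The three UserWarnings above all describe the same situation: a global dask.distributed client exists in the test session, but the Workflow/Dataset objects were created without one, so NVTabular falls back to the synchronous scheduler. As the messages say, the remedy is to pass the client explicitly. A minimal sketch with a hypothetical path and op chain (only the client= argument is the point here):

    from dask.distributed import Client
    import nvtabular as nvt
    from nvtabular import ops

    client = Client()  # assumes a local or already-running distributed cluster

    dataset = nvt.Dataset("data/*.parquet", client=client)               # hypothetical path
    workflow = nvt.Workflow(["x", "y"] >> ops.Normalize(), client=client)
    workflow.fit(dataset)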

tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.1]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:125: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[f"{col}_filled"] = df[col].isna()

tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.1]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:126: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[col] = df[col].fillna(self.medians[col])
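
These SettingWithCopyWarnings are raised because fill.py assigns into a DataFrame that pandas suspects is a view of another frame; the message's own suggestion is to assign through .loc (or to take an explicit copy of the slice first). A self-contained illustration of the pattern being warned about, with made-up column names rather than the actual fill.py code:

    import pandas as pd

    df = pd.DataFrame({"col": [1.0, None, 3.0], "other": [0, 1, 0]})

    view = df[df["other"] == 0]                  # slicing may return a view
    # view["col_filled"] = view["col"].isna()    # -> SettingWithCopyWarning

    safe = df[df["other"] == 0].copy()           # explicit copy: no warning
    safe["col_filled"] = safe["col"].isna()

    # or assign through .loc on the parent frame
    df.loc[df["other"] == 0, "col"] = df["col"].fillna(0.0)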

tests/unit/ops/test_ops.py::test_fill_missing[True-True-parquet]
tests/unit/ops/test_ops.py::test_fill_missing[True-False-parquet]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg/pandas/core/indexing.py:1732: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self._setitem_single_block(indexer, value, name)

tests/unit/ops/test_ops.py::test_fill_missing[True-True-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:54: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[f"{col}_filled"] = df[col].isna()

tests/unit/ops/test_ops.py::test_fill_missing[True-True-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:55: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[col] = df[col].fillna(self.fill_val)

tests/unit/ops/test_ops.py: 80 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/join_external.py:191: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[tmp] = _arange(len(df), like_df=df, dtype="int32")

tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_groupby_op[id-True]
tests/unit/ops/test_ops.py::test_groupby_op[id-False]
/usr/local/lib/python3.8/dist-packages/dask/dataframe/core.py:6778: UserWarning: Insufficient elements for head. 1 elements requested, only 0 elements available. Try passing larger npartitions to head.
warnings.warn(msg.format(n, len(r)))

tests/unit/workflow/test_cpu_workflow.py: 14 warnings
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg/pandas/core/frame.py:3641: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self[k1] = value[k2]

-- Docs: https://docs.pytest.org/en/stable/warnings.html

---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name Stmts Miss Branch BrPart Cover Missing

examples/multi-gpu-movielens/torch_trainer.py 65 0 6 1 99% 32->36
examples/multi-gpu-movielens/torch_trainer_dist.py 63 0 2 0 100%
nvtabular/__init__.py 18 0 0 0 100%
nvtabular/columns/__init__.py 2 0 0 0 100%
nvtabular/columns/schema.py 217 17 107 20 89% 46->62, 49, 51, 53-56, 58, 98->114, 106, 110, 156, 183, 269->276, 272->274, 283, 300->305, 303->305, 316, 340, 347, 356, 359, 364->363
nvtabular/columns/selector.py 74 1 34 0 99% 121
nvtabular/dispatch.py 290 55 144 23 79% 36-40, 45-47, 53-63, 70-71, 114-116, 121-124, 128-133, 140, 159, 170, 176, 181->183, 194, 217-220, 259->261, 268, 271, 277, 293, 300, 331->336, 334, 337, 340->344, 377, 388-391, 417-420, 450, 454, 495, 519, 521, 528
nvtabular/framework_utils/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/tensorflow/__init__.py 1 0 0 0 100%
nvtabular/framework_utils/tensorflow/feature_column_utils.py 134 78 90 15 39% 30, 99, 103, 114-130, 140, 143-158, 162, 166-167, 173-198, 207-217, 220-227, 229->233, 234, 239-279, 282
nvtabular/framework_utils/tensorflow/layers/__init__.py 4 0 0 0 100%
nvtabular/framework_utils/tensorflow/layers/embedding.py 153 12 85 6 91% 60, 68->49, 122, 179, 231-239, 335->343, 357->360, 363-364, 367
nvtabular/framework_utils/tensorflow/layers/interaction.py 47 25 20 1 43% 49, 74-103, 106-110, 113
nvtabular/framework_utils/tensorflow/layers/outer_product.py 30 24 10 0 15% 37-38, 41-60, 71-84, 87
nvtabular/framework_utils/tensorflow/tfrecords_to_parquet.py 58 58 30 0 0% 16-111
nvtabular/framework_utils/torch/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/torch/layers/__init__.py 2 0 0 0 100%
nvtabular/framework_utils/torch/layers/embeddings.py 32 2 14 2 91% 50, 91
nvtabular/framework_utils/torch/models.py 45 1 28 4 93% 57->61, 87->89, 93->96, 103
nvtabular/framework_utils/torch/utils.py 75 5 30 5 90% 51->53, 64, 71->76, 75, 118-120
nvtabular/inference/__init__.py 0 0 0 0 100%
nvtabular/inference/triton/__init__.py 385 210 180 13 45% 82-86, 141-174, 195-218, 263-307, 338, 364-372, 380-387, 406, 428-444, 485-489, 527-537, 583-623, 629-645, 649-716, 723->726, 726->722, 762-772, 781, 791, 812, 818-844, 850-876, 883, 889->892, 893
nvtabular/inference/triton/benchmarking_tools.py 52 52 10 0 0% 2-103
nvtabular/inference/triton/data_conversions.py 87 3 58 4 95% 32-33, 84
nvtabular/inference/triton/model.py 176 176 98 0 0% 27-332
nvtabular/inference/triton/model_config_pb2.py 299 0 2 0 100%
nvtabular/inference/triton/model_pt.py 101 101 40 0 0% 27-220
nvtabular/io/__init__.py 4 0 0 0 100%
nvtabular/io/avro.py 88 88 30 0 0% 16-189
nvtabular/io/csv.py 57 6 20 5 86% 22-23, 99, 103->107, 108, 110, 124
nvtabular/io/dask.py 183 8 72 11 93% 111, 114, 150, 401, 411, 428->431, 439, 443->445, 445->441, 450, 452
nvtabular/io/dataframe_engine.py 61 5 28 6 88% 19-20, 50, 69, 88->92, 92->97, 94->97, 97->116, 125
nvtabular/io/dataset.py 361 44 172 29 85% 47-48, 264, 266, 279, 304-318, 441->515, 446-449, 454->464, 471->469, 472->476, 489->493, 504, 515->524, 575-576, 577->581, 629, 752, 754, 756, 762, 766-768, 770, 830-831, 858, 865-866, 872, 878, 975-976, 1094-1099, 1105, 1117-1118, 1206
nvtabular/io/dataset_engine.py 31 1 4 0 97% 48
nvtabular/io/fsspec_utils.py 117 103 64 0 8% 26-27, 42-98, 103-135, 151-191, 213-263, 268-284, 288-290, 304-315
nvtabular/io/hugectr.py 45 2 24 2 91% 34, 74->97, 101
nvtabular/io/parquet.py 569 31 188 28 92% 34-35, 58, 80->103, 98, 123, 152-153, 170->195, 181->195, 232-240, 260, 266, 284->286, 300, 318->328, 321, 370->382, 374, 496-501, 539-544, 660->667, 728->733, 734-735, 855, 859, 863, 869, 901, 918, 922, 929->931, 1039->exit, 1049->1054, 1059->1069, 1074, 1096, 1123
nvtabular/io/shuffle.py 31 7 16 4 72% 42, 44-45, 49, 62-64
nvtabular/io/writer.py 185 13 74 5 92% 25-26, 52, 80, 126, 129, 213, 222, 225, 268, 300-302
nvtabular/io/writer_factory.py 18 2 8 2 85% 35, 60
nvtabular/loader/__init__.py 0 0 0 0 100%
nvtabular/loader/backend.py 355 13 146 11 95% 129, 144-145, 273->275, 285-289, 335-336, 375->379, 376->375, 450, 454-455, 485, 590, 598
nvtabular/loader/tensorflow.py 163 22 52 7 86% 58, 66-69, 84, 98, 308, 344, 359-361, 390-392, 402-410, 413-416
nvtabular/loader/tf_utils.py 55 10 20 5 80% 29->32, 32->34, 39->41, 43, 50-51, 58-60, 66-70
nvtabular/loader/torch.py 81 13 16 2 78% 25-27, 30-36, 111, 149-150
nvtabular/ops/__init__.py 22 0 0 0 100%
nvtabular/ops/add_metadata.py 9 0 0 0 100%
nvtabular/ops/bucketize.py 37 10 18 3 69% 53-55, 59->exit, 62-65, 84-87, 94
nvtabular/ops/categorify.py 622 66 332 47 86% 244, 246, 263, 267, 275, 283, 285, 312, 331-332, 359, 370->374, 378-385, 467-468, 493-494, 611, 704, 722, 758, 836-837, 852-856, 857->821, 875, 883, 890->exit, 914, 917->920, 972, 977, 999->1003, 1005->962, 1011-1014, 1026, 1030, 1032, 1039, 1044-1047, 1125, 1127, 1197->1220, 1203->1220, 1221-1226, 1263, 1282->1287, 1286, 1296->1293, 1301->1293, 1308, 1311, 1319-1329
nvtabular/ops/clip.py 18 2 6 3 79% 44, 52->54, 55
nvtabular/ops/column_similarity.py 118 25 38 5 74% 19-20, 78->exit, 108, 134, 198-199, 208-210, 218-234, 251->254, 255, 265
nvtabular/ops/data_stats.py 56 2 22 3 94% 91->93, 95, 97->87, 102
nvtabular/ops/difference_lag.py 31 1 8 1 95% 69->71, 94
nvtabular/ops/dropna.py 8 0 0 0 100%
nvtabular/ops/fill.py 91 12 36 3 82% 63-67, 93, 121, 147, 151, 162-165
nvtabular/ops/filter.py 20 1 6 1 92% 49
nvtabular/ops/groupby.py 119 3 70 4 96% 73, 84, 94->96, 106->111, 141
nvtabular/ops/hash_bucket.py 41 2 20 2 93% 72, 106->112, 118
nvtabular/ops/hashed_cross.py 36 4 15 3 86% 53, 66, 81, 91
nvtabular/ops/internal/__init__.py 3 0 0 0 100%
nvtabular/ops/internal/concat_columns.py 11 0 0 0 100%
nvtabular/ops/internal/identity.py 6 1 0 0 83% 42
nvtabular/ops/internal/subset_columns.py 13 1 0 0 92% 53
nvtabular/ops/join_external.py 92 18 36 7 76% 20-21, 114, 116, 118, 135-161, 177->179, 216->227, 221
nvtabular/ops/join_groupby.py 101 7 36 4 92% 108, 115, 124, 131->130, 215-216, 219-220
nvtabular/ops/lambdaop.py 39 6 18 6 79% 59, 63, 77, 89, 94, 103
nvtabular/ops/list_slice.py 66 24 26 1 58% 21-22, 53-54, 104-118, 126-137
nvtabular/ops/logop.py 13 0 0 0 100%
nvtabular/ops/moments.py 65 0 20 0 100%
nvtabular/ops/normalize.py 81 10 14 1 86% 70, 78-79, 85, 118-119, 141-142, 146, 157
nvtabular/ops/operator.py 66 1 14 1 98% 111
nvtabular/ops/rename.py 41 3 22 3 90% 47, 88-90
nvtabular/ops/stat_operator.py 8 0 0 0 100%
nvtabular/ops/target_encoding.py 153 11 66 4 91% 167->171, 175->184, 232-233, 236-237, 249-255, 346->349, 362
nvtabular/tags.py 16 0 0 0 100%
nvtabular/tools/__init__.py 0 0 0 0 100%
nvtabular/tools/data_gen.py 236 1 62 1 99% 321
nvtabular/tools/dataset_inspector.py 50 7 18 1 79% 32-39
nvtabular/tools/inspector_script.py 46 46 0 0 0% 17-168
nvtabular/utils.py 106 43 48 8 54% 31-32, 36-37, 50, 61-62, 64-66, 69, 72, 78, 84, 90-126, 145, 149->153
nvtabular/worker.py 82 5 38 7 90% 24-25, 82->99, 91, 92->99, 99->102, 108, 110, 111->113
nvtabular/workflow/__init__.py 2 0 0 0 100%
nvtabular/workflow/node.py 240 18 116 10 89% 55, 93->98, 146, 248->252, 288, 302, 311, 329-334, 339, 388-389, 400->395, 453-458
nvtabular/workflow/workflow.py 221 15 112 7 93% 28-29, 47, 139, 195, 222-224, 332, 347-348, 366-367, 503, 515

TOTAL 7799 1533 3139 347 78%
Coverage XML written to file coverage.xml

Required test coverage of 70% reached. Total coverage: 77.63%
=========================== short test summary info ============================
SKIPPED [1] ../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/io/tests/test_s3.py:16: could not import 's3fs': No module named 's3fs'
SKIPPED [8] tests/unit/test_io.py:581: could not import 'uavro': No module named 'uavro'
SKIPPED [1] tests/unit/loader/test_tf_dataloader.py:521: not working correctly in ci environment
========= 1534 passed, 10 skipped, 697 warnings in 1560.90s (0:26:00) ==========
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://github.com/gitapi/repos/NVIDIA/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins2278666542669344749.sh

@nvidia-merlin-bot
Contributor

Click to view CI Results
GitHub pull request #1147 of commit 8c0279f623eed9d1c387636b07893f6e9b38573f, no merge conflicts.
Running as SYSTEM
Setting status of 8c0279f623eed9d1c387636b07893f6e9b38573f to PENDING with url http://10.20.13.93:8080/job/nvtabular_tests/3586/ and message: 'Pending'
Using context: Jenkins Unit Test Run
Building in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA/NVTabular.git
 > git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA/NVTabular.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA/NVTabular.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA/NVTabular.git +refs/pull/1147/*:refs/remotes/origin/pr/1147/* # timeout=10
 > git rev-parse 8c0279f623eed9d1c387636b07893f6e9b38573f^{commit} # timeout=10
Checking out Revision 8c0279f623eed9d1c387636b07893f6e9b38573f (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 8c0279f623eed9d1c387636b07893f6e9b38573f # timeout=10
Commit message: "Merge branch 'main' into epochs-arg"
 > git rev-list --no-walk 9c3ae47af237a9e7c566155b9d698fe01fa1164f # timeout=10
First time build. Skipping changelog.
[nvtabular_tests] $ /bin/bash /tmp/jenkins4287104156267893628.sh
Installing NVTabular
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Requirement already satisfied: pip in /var/jenkins_home/.local/lib/python3.8/site-packages (21.2.4)
Requirement already satisfied: setuptools in /var/jenkins_home/.local/lib/python3.8/site-packages (58.2.0)
Requirement already satisfied: wheel in /var/jenkins_home/.local/lib/python3.8/site-packages (0.37.0)
Requirement already satisfied: pybind11 in /var/jenkins_home/.local/lib/python3.8/site-packages (2.8.0)
running develop
running egg_info
creating nvtabular.egg-info
writing nvtabular.egg-info/PKG-INFO
writing dependency_links to nvtabular.egg-info/dependency_links.txt
writing requirements to nvtabular.egg-info/requires.txt
writing top-level names to nvtabular.egg-info/top_level.txt
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching '*.h' under directory 'cpp'
warning: no files found matching '*.cu' under directory 'cpp'
warning: no files found matching '*.cuh' under directory 'cpp'
adding license file 'LICENSE'
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
running build_ext
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.8 -c flagcheck.cpp -o flagcheck.o -std=c++17
building 'nvtabular_cpp' extension
creating build
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/cpp
creating build/temp.linux-x86_64-3.8/cpp/nvtabular
creating build/temp.linux-x86_64-3.8/cpp/nvtabular/inference
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+16.g8c0279f -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+16.g8c0279f -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+16.g8c0279f -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/categorify.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+16.g8c0279f -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/fill.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -std=c++17 -fvisibility=hidden -g0
creating build/lib.linux-x86_64-3.8
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -o build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so
copying build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so -> 
Generating nvtabular/inference/triton/model_config_pb2.py from nvtabular/inference/triton/model_config.proto
Creating /var/jenkins_home/.local/lib/python3.8/site-packages/nvtabular.egg-link (link to .)
nvtabular 0.7.0+16.g8c0279f is already the active version in easy-install.pth

Installed /var/jenkins_home/workspace/nvtabular_tests/nvtabular
Processing dependencies for nvtabular==0.7.0+16.g8c0279f
Searching for protobuf==3.18.0
Best match: protobuf 3.18.0
Adding protobuf 3.18.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for tensorflow-metadata==1.2.0
Best match: tensorflow-metadata 1.2.0
Processing tensorflow_metadata-1.2.0-py3.8.egg
tensorflow-metadata 1.2.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tensorflow_metadata-1.2.0-py3.8.egg
Searching for pyarrow==4.0.1
Best match: pyarrow 4.0.1
Adding pyarrow 4.0.1 to easy-install.pth file
Installing plasma_store script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for tqdm==4.61.2
Best match: tqdm 4.61.2
Processing tqdm-4.61.2-py3.8.egg
tqdm 4.61.2 is already the active version in easy-install.pth
Installing tqdm script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tqdm-4.61.2-py3.8.egg
Searching for numba==0.54.0
Best match: numba 0.54.0
Processing numba-0.54.0-py3.8-linux-x86_64.egg
numba 0.54.0 is already the active version in easy-install.pth
Installing pycc script to /var/jenkins_home/.local/bin
Installing numba script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/numba-0.54.0-py3.8-linux-x86_64.egg
Searching for pandas==1.3.3
Best match: pandas 1.3.3
Processing pandas-1.3.3-py3.8-linux-x86_64.egg
pandas 1.3.3 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg
Searching for distributed==2021.7.1
Best match: distributed 2021.7.1
Processing distributed-2021.7.1-py3.8.egg
distributed 2021.7.1 is already the active version in easy-install.pth
Installing dask-ssh script to /var/jenkins_home/.local/bin
Installing dask-scheduler script to /var/jenkins_home/.local/bin
Installing dask-worker script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/distributed-2021.7.1-py3.8.egg
Searching for dask==2021.7.1
Best match: dask 2021.7.1
Adding dask 2021.7.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for googleapis-common-protos==1.53.0
Best match: googleapis-common-protos 1.53.0
Processing googleapis_common_protos-1.53.0-py3.8.egg
googleapis-common-protos 1.53.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/googleapis_common_protos-1.53.0-py3.8.egg
Searching for absl-py==0.12.0
Best match: absl-py 0.12.0
Processing absl_py-0.12.0-py3.8.egg
absl-py 0.12.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/absl_py-0.12.0-py3.8.egg
Searching for numpy==1.20.2
Best match: numpy 1.20.2
Adding numpy 1.20.2 to easy-install.pth file
Installing f2py script to /var/jenkins_home/.local/bin
Installing f2py3 script to /var/jenkins_home/.local/bin
Installing f2py3.8 script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for setuptools==58.2.0
Best match: setuptools 58.2.0
Adding setuptools 58.2.0 to easy-install.pth file

Using /var/jenkins_home/.local/lib/python3.8/site-packages
Searching for llvmlite==0.37.0
Best match: llvmlite 0.37.0
Processing llvmlite-0.37.0-py3.8-linux-x86_64.egg
llvmlite 0.37.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/llvmlite-0.37.0-py3.8-linux-x86_64.egg
Searching for pytz==2021.1
Best match: pytz 2021.1
Adding pytz 2021.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for python-dateutil==2.8.2
Best match: python-dateutil 2.8.2
Adding python-dateutil 2.8.2 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for zict==2.0.0
Best match: zict 2.0.0
Processing zict-2.0.0-py3.8.egg
zict 2.0.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg
Searching for tornado==6.1
Best match: tornado 6.1
Processing tornado-6.1-py3.8-linux-x86_64.egg
tornado 6.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg
Searching for toolz==0.11.1
Best match: toolz 0.11.1
Processing toolz-0.11.1-py3.8.egg
toolz 0.11.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/toolz-0.11.1-py3.8.egg
Searching for tblib==1.7.0
Best match: tblib 1.7.0
Processing tblib-1.7.0-py3.8.egg
tblib 1.7.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg
Searching for sortedcontainers==2.4.0
Best match: sortedcontainers 2.4.0
Processing sortedcontainers-2.4.0-py3.8.egg
sortedcontainers 2.4.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg
Searching for PyYAML==5.4.1
Best match: PyYAML 5.4.1
Processing PyYAML-5.4.1-py3.8-linux-x86_64.egg
PyYAML 5.4.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg
Searching for psutil==5.8.0
Best match: psutil 5.8.0
Processing psutil-5.8.0-py3.8-linux-x86_64.egg
psutil 5.8.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg
Searching for msgpack==1.0.2
Best match: msgpack 1.0.2
Processing msgpack-1.0.2-py3.8-linux-x86_64.egg
msgpack 1.0.2 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/msgpack-1.0.2-py3.8-linux-x86_64.egg
Searching for cloudpickle==1.6.0
Best match: cloudpickle 1.6.0
Processing cloudpickle-1.6.0-py3.8.egg
cloudpickle 1.6.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/cloudpickle-1.6.0-py3.8.egg
Searching for click==8.0.1
Best match: click 8.0.1
Processing click-8.0.1-py3.8.egg
click 8.0.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/click-8.0.1-py3.8.egg
Searching for fsspec==2021.9.0
Best match: fsspec 2021.9.0
Adding fsspec 2021.9.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for partd==1.2.0
Best match: partd 1.2.0
Processing partd-1.2.0-py3.8.egg
partd 1.2.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg
Searching for packaging==21.0
Best match: packaging 21.0
Adding packaging 21.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for six==1.15.0
Best match: six 1.15.0
Adding six 1.15.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for HeapDict==1.0.1
Best match: HeapDict 1.0.1
Processing HeapDict-1.0.1-py3.8.egg
HeapDict 1.0.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg
Searching for locket==0.2.1
Best match: locket 0.2.1
Processing locket-0.2.1-py3.8.egg
locket 0.2.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg
Searching for pyparsing==2.4.7
Best match: pyparsing 2.4.7
Adding pyparsing 2.4.7 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Finished processing dependencies for nvtabular==0.7.0+16.g8c0279f
Running black --check
All done! ✨ 🍰 ✨
130 files would be left unchanged.
Running flake8
Running isort
Skipped 2 files
Running bandit
Running pylint
************* Module nvtabular.ops.categorify
nvtabular/ops/categorify.py:497:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)
************* Module nvtabular.ops.fill
nvtabular/ops/fill.py:67:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)


Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)

Running flake8-nb
Building docs
make: Entering directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.7) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
make: Leaving directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-6.2.5, py-1.10.0, pluggy-1.0.0
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: cov-2.12.1, forked-1.3.0, xdist-2.4.0
collected 1543 items / 1 skipped / 1542 selected

tests/unit/test_dask_nvt.py ............................................ [ 2%]
....................................................................... [ 7%]
tests/unit/test_io.py .................................................. [ 10%]
........................................................................ [ 15%]
.................ssssssss............................................... [ 20%]
........ [ 20%]
tests/unit/test_notebooks.py ...... [ 20%]
tests/unit/test_tf4rec.py . [ 20%]
tests/unit/test_tools.py ...................... [ 22%]
tests/unit/test_triton_inference.py .............................. [ 24%]
tests/unit/columns/test_column_schemas.py .............................. [ 26%]
.................................................... [ 29%]
tests/unit/columns/test_column_selector.py .................... [ 30%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 31%]
tests/unit/framework_utils/test_tf_layers.py ..F........................ [ 32%]
................................................... [ 36%]
tests/unit/framework_utils/test_torch_layers.py . [ 36%]
tests/unit/loader/test_dataloader_backend.py .... [ 36%]
tests/unit/loader/test_tf_dataloader.py ................................ [ 38%]
........................................s.. [ 41%]
tests/unit/loader/test_torch_dataloader.py ............................. [ 43%]
........................................................ [ 46%]
tests/unit/ops/test_column_similarity.py ........................ [ 48%]
tests/unit/ops/test_ops.py ............................................. [ 51%]
........................................................................ [ 55%]
........................................................................ [ 60%]
........................................................................ [ 65%]
........................................................................ [ 69%]
........................................................................ [ 74%]
................................................. [ 77%]
tests/unit/ops/test_ops_schema.py ...................................... [ 80%]
........................................................................ [ 84%]
........................................................................ [ 89%]
.......................... [ 91%]
tests/unit/workflow/test_cpu_workflow.py ...... [ 91%]
tests/unit/workflow/test_workflow.py ................................... [ 93%]
.......................................................... [ 97%]
tests/unit/workflow/test_workflow_node.py ........... [ 98%]
tests/unit/workflow/test_workflow_ops.py .. [ 98%]
tests/unit/workflow/test_workflow_schemas.py ....................... [100%]

=================================== FAILURES ===================================
____________________ test_dense_embedding_layer[mean-stack] ____________________

aggregation = 'stack', combiner = 'mean'

@pytest.mark.parametrize("aggregation", ["stack", "concat"])
@pytest.mark.parametrize("combiner", ["sum", "mean"])  # TODO: add sqrtn
def test_dense_embedding_layer(aggregation, combiner):
    raw_good_columns = get_good_feature_columns()
    scalar_numeric, vector_numeric, one_hot, multi_hot = raw_good_columns
    one_hot_embedding = tf.feature_column.indicator_column(one_hot)
    multi_hot_embedding = tf.feature_column.embedding_column(multi_hot, 8, combiner=combiner)

    # should raise ValueError if passed categorical columns
    with pytest.raises(ValueError):
        embedding_layer = layers.DenseFeatures(raw_good_columns, aggregation=aggregation)

    if aggregation == "stack":
        # can't pass numeric to stack aggregation unless dims are 1
        with pytest.raises(ValueError):
            embedding_layer = layers.DenseFeatures(
                [
                    scalar_numeric,
                    vector_numeric,
                    one_hot_embedding,
                    multi_hot_embedding,
                ],
                aggregation=aggregation,
            )
        # can't have mismatched dims with stack aggregation
        with pytest.raises(ValueError):
            embedding_layer = layers.DenseFeatures(
                [one_hot_embedding, multi_hot_embedding], aggregation=aggregation
            )

        # reset b embedding to have matching dims
        multi_hot_embedding = tf.feature_column.embedding_column(multi_hot, 100, combiner=combiner)
        cols = [one_hot_embedding, multi_hot_embedding]
    else:
        cols = [scalar_numeric, vector_numeric, one_hot_embedding, multi_hot_embedding]

    embedding_layer = layers.DenseFeatures(cols, aggregation=aggregation)
    inputs = {
        "scalar_continuous": tf.keras.Input(name="scalar_continuous", shape=(1,), dtype=tf.float32),
        "vector_continuous": tf.keras.Input(
            name="vector_continuous__values", shape=(1,), dtype=tf.float32
        ),
        "one_hot": tf.keras.Input(name="one_hot", shape=(1,), dtype=tf.int64),
        "multi_hot": (
            tf.keras.Input(name="multi_hot__values", shape=(1,), dtype=tf.int64),
            tf.keras.Input(name="multi_hot__nnzs", shape=(1,), dtype=tf.int64),
        ),
    }
    if aggregation == "stack":
        inputs.pop("scalar_continuous")
        inputs.pop("vector_continuous")

    output = embedding_layer(inputs)
    model = tf.keras.Model(inputs=inputs, outputs=output)
    model.compile("sgd", "mse")

    # TODO: check for out-of-range categorical behavior
    scalar = np.array([0.1, -0.2, 0.3], dtype=np.float32)
    vector = np.random.randn(3, 128).astype("float32")
    one_hot = np.array([44, 21, 32])
    multi_hot_values = np.array([0, 2, 1, 4, 1, 3, 1])
    multi_hot_nnzs = np.array([1, 2, 4])
    x = {
        "scalar_continuous": scalar[:, None],
        "vector_continuous": vector.flatten()[:, None],
        "one_hot": one_hot[:, None],
        "multi_hot": (multi_hot_values[:, None], multi_hot_nnzs[:, None]),
    }
    if aggregation == "stack":
        x.pop("scalar_continuous")
        x.pop("vector_continuous")

    multi_hot_embedding_table = embedding_layer.embedding_tables["multi_hot"].numpy()
    multi_hot_embedding_rows = _compute_expected_multi_hot(
        multi_hot_embedding_table, multi_hot_values, multi_hot_nnzs, combiner
    )

    # check that shape and values match up
    y_hat = model(x).numpy()
    assert y_hat.shape[0] == 3
    if aggregation == "stack":
        assert len(y_hat.shape) == 3
        # len of columns is 2 because of mh (vals, nnzs) struct
        assert y_hat.shape[1] == (len(x))
        assert y_hat.shape[2] == 100
      np.testing.assert_allclose(y_hat[:, 0], multi_hot_embedding_rows, rtol=1e-05)

E AssertionError:
E Not equal to tolerance rtol=1e-05, atol=0
E
E Mismatched elements: 1 / 300 (0.333%)
E Max absolute difference: 1.4901161e-08
E Max relative difference: 1.294951e-05
E x: array([[ 1.611694e-01, 9.836995e-02, -1.515752e-01, -2.095548e-01,
E -1.804091e-01, -3.107491e-01, -1.189290e-01, -8.672244e-02,
E 6.637913e-02, -1.415611e-01, -1.516651e-01, -1.109596e-01,...
E y: array([[ 1.611694e-01, 9.836995e-02, -1.515752e-01, -2.095548e-01,
E -1.804091e-01, -3.107491e-01, -1.189290e-01, -8.672244e-02,
E 6.637913e-02, -1.415611e-01, -1.516651e-01, -1.109596e-01,...

tests/unit/framework_utils/test_tf_layers.py:139: AssertionError
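
For context on the failure: np.testing.assert_allclose with atol left at its default of 0 requires |actual - desired| <= rtol * |desired| element-wise. The single mismatched element differs by about 1.49e-08 absolutely but 1.29e-05 relatively, i.e. the value itself is of order 1e-03, so a pure rtol=1e-05 bound sits below ordinary float32 rounding noise. A small sketch with hypothetical numbers chosen only to match those magnitudes:

    import numpy as np

    actual = np.float32(1.1510000e-03)    # stand-in for the mismatched y_hat element
    desired = np.float32(1.1510149e-03)   # stand-in for the expected embedding value
    rtol, atol = 1e-05, 0.0

    print(abs(actual - desired) <= atol + rtol * abs(desired))   # False -> assert_allclose raises
    print(np.allclose(actual, desired, rtol=rtol, atol=1e-07))   # True once a small atol is allowed

Loosening the test slightly (a nonzero atol or an rtol on the order of 1e-04) would make it robust to this kind of last-bit nondeterminism.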
=============================== warnings summary ===============================
tests/unit/test_dask_nvt.py: 3 warnings
tests/unit/test_io.py: 24 warnings
tests/unit/test_tf4rec.py: 1 warning
tests/unit/test_tools.py: 2 warnings
tests/unit/test_triton_inference.py: 5 warnings
tests/unit/loader/test_tf_dataloader.py: 50 warnings
tests/unit/loader/test_torch_dataloader.py: 16 warnings
tests/unit/ops/test_column_similarity.py: 7 warnings
tests/unit/ops/test_ops.py: 74 warnings
tests/unit/workflow/test_workflow.py: 30 warnings
tests/unit/workflow/test_workflow_node.py: 1 warning
tests/unit/workflow/test_workflow_schemas.py: 1 warning
/var/jenkins_home/.local/lib/python3.8/site-packages/numba-0.54.0-py3.8-linux-x86_64.egg/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (1) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))

tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 36 warnings
tests/unit/workflow/test_workflow.py: 44 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/workflow.py:89: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for execution. Please use the client argument to initialize a Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 52 warnings
tests/unit/workflow/test_workflow.py: 35 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dask.py:375: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for this write operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/test_io.py: 96 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/__init__.py:38: DeprecationWarning: ColumnGroup is deprecated, use ColumnSelector instead
warnings.warn("ColumnGroup is deprecated, use ColumnSelector instead", DeprecationWarning)

tests/unit/test_io.py: 24 warnings
tests/unit/loader/test_torch_dataloader.py: 54 warnings
tests/unit/workflow/test_workflow_node.py: 1 warning
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/node.py:47: FutureWarning: The ["a", "b", "c"] >> ops.Operator syntax for creating a ColumnGroup has been deprecated in NVTabular 21.09 and will be removed in a future version.
warnings.warn(

tests/unit/test_io.py: 20 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:516: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler is being used for this shuffle operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/ops/test_column_similarity.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/column_similarity.py:109: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[name] = similarities
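The SettingWithCopyWarning above comes from assigning into a frame that pandas suspects is a view of another frame. A generic pandas sketch of the pattern and of the .loc-based fix the warning recommends; this is not the NVTabular code itself.

import pandas as pd

df = pd.DataFrame({"a": [1, 2, 3], "b": [4, 5, 6]})

# Chained indexing: the boolean slice may be a view, so assigning into it warns.
subset = df[df["a"] > 1]
subset["c"] = subset["a"] * 2          # SettingWithCopyWarning, as in the log

# Either take an explicit copy before assigning ...
subset = df[df["a"] > 1].copy()
subset["c"] = subset["a"] * 2          # no warning

# ... or write through .loc on the original frame, as pandas suggests.
df.loc[df["a"] > 1, "c"] = df["a"] * 2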

tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.1]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:125: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[f"{col}_filled"] = df[col].isna()

tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.1]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:126: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[col] = df[col].fillna(self.medians[col])

tests/unit/ops/test_ops.py::test_fill_missing[True-True-parquet]
tests/unit/ops/test_ops.py::test_fill_missing[True-False-parquet]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg/pandas/core/indexing.py:1732: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self._setitem_single_block(indexer, value, name)

tests/unit/ops/test_ops.py::test_fill_missing[True-True-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:54: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[f"{col}_filled"] = df[col].isna()

tests/unit/ops/test_ops.py::test_fill_missing[True-True-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:55: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[col] = df[col].fillna(self.fill_val)

tests/unit/ops/test_ops.py: 80 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/join_external.py:191: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[tmp] = _arange(len(df), like_df=df, dtype="int32")

tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_groupby_op[id-True]
tests/unit/ops/test_ops.py::test_groupby_op[id-False]
/usr/local/lib/python3.8/dist-packages/dask/dataframe/core.py:6778: UserWarning: Insufficient elements for head. 1 elements requested, only 0 elements available. Try passing larger npartitions to head.
warnings.warn(msg.format(n, len(r)))
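The UserWarning above fires when head() finds fewer rows than requested in the partitions it inspects; passing npartitions=-1 (scan all partitions) is the workaround dask itself suggests. A minimal sketch with made-up data:

import dask.dataframe as dd
import pandas as pd

ddf = dd.from_pandas(pd.DataFrame({"x": range(10)}), npartitions=5)
filtered = ddf[ddf["x"] > 7]           # only the last partition has surviving rows

filtered.head(1)                       # may warn: the first partition is empty
filtered.head(1, npartitions=-1)       # scans every partition, as the warning suggests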

tests/unit/workflow/test_cpu_workflow.py: 14 warnings
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg/pandas/core/frame.py:3641: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self[k1] = value[k2]

-- Docs: https://docs.pytest.org/en/stable/warnings.html

---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name Stmts Miss Branch BrPart Cover Missing

examples/multi-gpu-movielens/torch_trainer.py 65 0 6 1 99% 32->36
examples/multi-gpu-movielens/torch_trainer_dist.py 63 0 2 0 100%
nvtabular/__init__.py 18 0 0 0 100%
nvtabular/columns/__init__.py 2 0 0 0 100%
nvtabular/columns/schema.py 217 17 107 20 89% 46->62, 49, 51, 53-56, 58, 98->114, 106, 110, 156, 183, 269->276, 272->274, 283, 300->305, 303->305, 316, 340, 347, 356, 359, 364->363
nvtabular/columns/selector.py 74 1 34 0 99% 121
nvtabular/dispatch.py 290 55 144 23 79% 36-40, 45-47, 53-63, 70-71, 114-116, 121-124, 128-133, 140, 159, 170, 176, 181->183, 194, 217-220, 259->261, 268, 271, 277, 293, 300, 331->336, 334, 337, 340->344, 377, 388-391, 417-420, 450, 454, 495, 519, 521, 528
nvtabular/framework_utils/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/tensorflow/__init__.py 1 0 0 0 100%
nvtabular/framework_utils/tensorflow/feature_column_utils.py 134 78 90 15 39% 30, 99, 103, 114-130, 140, 143-158, 162, 166-167, 173-198, 207-217, 220-227, 229->233, 234, 239-279, 282
nvtabular/framework_utils/tensorflow/layers/__init__.py 4 0 0 0 100%
nvtabular/framework_utils/tensorflow/layers/embedding.py 153 12 85 6 91% 60, 68->49, 122, 179, 231-239, 335->343, 357->360, 363-364, 367
nvtabular/framework_utils/tensorflow/layers/interaction.py 47 25 20 1 43% 49, 74-103, 106-110, 113
nvtabular/framework_utils/tensorflow/layers/outer_product.py 30 24 10 0 15% 37-38, 41-60, 71-84, 87
nvtabular/framework_utils/tensorflow/tfrecords_to_parquet.py 58 58 30 0 0% 16-111
nvtabular/framework_utils/torch/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/torch/layers/__init__.py 2 0 0 0 100%
nvtabular/framework_utils/torch/layers/embeddings.py 32 2 14 2 91% 50, 91
nvtabular/framework_utils/torch/models.py 45 1 28 4 93% 57->61, 87->89, 93->96, 103
nvtabular/framework_utils/torch/utils.py 75 5 30 5 90% 51->53, 64, 71->76, 75, 118-120
nvtabular/inference/__init__.py 0 0 0 0 100%
nvtabular/inference/triton/__init__.py 385 210 180 13 45% 82-86, 141-174, 195-218, 263-307, 338, 364-372, 380-387, 406, 428-444, 485-489, 527-537, 583-623, 629-645, 649-716, 723->726, 726->722, 762-772, 781, 791, 812, 818-844, 850-876, 883, 889->892, 893
nvtabular/inference/triton/benchmarking_tools.py 52 52 10 0 0% 2-103
nvtabular/inference/triton/data_conversions.py 87 3 58 4 95% 32-33, 84
nvtabular/inference/triton/model.py 176 176 98 0 0% 27-332
nvtabular/inference/triton/model_config_pb2.py 299 0 2 0 100%
nvtabular/inference/triton/model_pt.py 101 101 40 0 0% 27-220
nvtabular/io/__init__.py 4 0 0 0 100%
nvtabular/io/avro.py 88 88 30 0 0% 16-189
nvtabular/io/csv.py 57 6 20 5 86% 22-23, 99, 103->107, 108, 110, 124
nvtabular/io/dask.py 183 8 72 11 93% 111, 114, 150, 401, 411, 428->431, 439, 443->445, 445->441, 450, 452
nvtabular/io/dataframe_engine.py 61 5 28 6 88% 19-20, 50, 69, 88->92, 92->97, 94->97, 97->116, 125
nvtabular/io/dataset.py 361 44 172 29 85% 47-48, 264, 266, 279, 304-318, 441->515, 446-449, 454->464, 471->469, 472->476, 489->493, 504, 515->524, 575-576, 577->581, 629, 752, 754, 756, 762, 766-768, 770, 830-831, 858, 865-866, 872, 878, 975-976, 1094-1099, 1105, 1117-1118, 1206
nvtabular/io/dataset_engine.py 31 1 4 0 97% 48
nvtabular/io/fsspec_utils.py 117 103 64 0 8% 26-27, 42-98, 103-135, 151-191, 213-263, 268-284, 288-290, 304-315
nvtabular/io/hugectr.py 45 2 24 2 91% 34, 74->97, 101
nvtabular/io/parquet.py 569 31 188 28 92% 34-35, 58, 80->103, 98, 123, 152-153, 170->195, 181->195, 232-240, 260, 266, 284->286, 300, 318->328, 321, 370->382, 374, 496-501, 539-544, 660->667, 728->733, 734-735, 855, 859, 863, 869, 901, 918, 922, 929->931, 1039->exit, 1049->1054, 1059->1069, 1074, 1096, 1123
nvtabular/io/shuffle.py 31 7 16 4 72% 42, 44-45, 49, 62-64
nvtabular/io/writer.py 185 13 74 5 92% 25-26, 52, 80, 126, 129, 213, 222, 225, 268, 300-302
nvtabular/io/writer_factory.py 18 2 8 2 85% 35, 60
nvtabular/loader/__init__.py 0 0 0 0 100%
nvtabular/loader/backend.py 355 13 146 11 95% 129, 144-145, 273->275, 285-289, 335-336, 375->379, 376->375, 450, 454-455, 485, 590, 598
nvtabular/loader/tensorflow.py 163 22 52 7 86% 58, 66-69, 84, 98, 308, 344, 359-361, 390-392, 402-410, 413-416
nvtabular/loader/tf_utils.py 55 10 20 5 80% 29->32, 32->34, 39->41, 43, 50-51, 58-60, 66-70
nvtabular/loader/torch.py 81 13 16 2 78% 25-27, 30-36, 111, 149-150
nvtabular/ops/__init__.py 22 0 0 0 100%
nvtabular/ops/add_metadata.py 9 0 0 0 100%
nvtabular/ops/bucketize.py 37 10 18 3 69% 53-55, 59->exit, 62-65, 84-87, 94
nvtabular/ops/categorify.py 622 66 332 47 86% 244, 246, 263, 267, 275, 283, 285, 312, 331-332, 359, 370->374, 378-385, 467-468, 493-494, 611, 704, 722, 758, 836-837, 852-856, 857->821, 875, 883, 890->exit, 914, 917->920, 972, 977, 999->1003, 1005->962, 1011-1014, 1026, 1030, 1032, 1039, 1044-1047, 1125, 1127, 1197->1220, 1203->1220, 1221-1226, 1263, 1282->1287, 1286, 1296->1293, 1301->1293, 1308, 1311, 1319-1329
nvtabular/ops/clip.py 18 2 6 3 79% 44, 52->54, 55
nvtabular/ops/column_similarity.py 118 25 38 5 74% 19-20, 78->exit, 108, 134, 198-199, 208-210, 218-234, 251->254, 255, 265
nvtabular/ops/data_stats.py 56 2 22 3 94% 91->93, 95, 97->87, 102
nvtabular/ops/difference_lag.py 31 1 8 1 95% 69->71, 94
nvtabular/ops/dropna.py 8 0 0 0 100%
nvtabular/ops/fill.py 91 12 36 3 82% 63-67, 93, 121, 147, 151, 162-165
nvtabular/ops/filter.py 20 1 6 1 92% 49
nvtabular/ops/groupby.py 119 3 70 4 96% 73, 84, 94->96, 106->111, 141
nvtabular/ops/hash_bucket.py 41 2 20 2 93% 72, 106->112, 118
nvtabular/ops/hashed_cross.py 36 4 15 3 86% 53, 66, 81, 91
nvtabular/ops/internal/__init__.py 3 0 0 0 100%
nvtabular/ops/internal/concat_columns.py 11 0 0 0 100%
nvtabular/ops/internal/identity.py 6 1 0 0 83% 42
nvtabular/ops/internal/subset_columns.py 13 1 0 0 92% 53
nvtabular/ops/join_external.py 92 18 36 7 76% 20-21, 114, 116, 118, 135-161, 177->179, 216->227, 221
nvtabular/ops/join_groupby.py 101 7 36 4 92% 108, 115, 124, 131->130, 215-216, 219-220
nvtabular/ops/lambdaop.py 39 6 18 6 79% 59, 63, 77, 89, 94, 103
nvtabular/ops/list_slice.py 66 24 26 1 58% 21-22, 53-54, 104-118, 126-137
nvtabular/ops/logop.py 13 0 0 0 100%
nvtabular/ops/moments.py 65 0 20 0 100%
nvtabular/ops/normalize.py 81 10 14 1 86% 70, 78-79, 85, 118-119, 141-142, 146, 157
nvtabular/ops/operator.py 66 1 14 1 98% 111
nvtabular/ops/rename.py 41 3 22 3 90% 47, 88-90
nvtabular/ops/stat_operator.py 8 0 0 0 100%
nvtabular/ops/target_encoding.py 153 11 66 4 91% 167->171, 175->184, 232-233, 236-237, 249-255, 346->349, 362
nvtabular/tags.py 16 0 0 0 100%
nvtabular/tools/__init__.py 0 0 0 0 100%
nvtabular/tools/data_gen.py 236 1 62 1 99% 321
nvtabular/tools/dataset_inspector.py 50 7 18 1 79% 32-39
nvtabular/tools/inspector_script.py 46 46 0 0 0% 17-168
nvtabular/utils.py 106 43 48 8 54% 31-32, 36-37, 50, 61-62, 64-66, 69, 72, 78, 84, 90-126, 145, 149->153
nvtabular/worker.py 82 5 38 7 90% 24-25, 82->99, 91, 92->99, 99->102, 108, 110, 111->113
nvtabular/workflow/__init__.py 2 0 0 0 100%
nvtabular/workflow/node.py 240 18 116 10 89% 55, 93->98, 146, 248->252, 288, 302, 311, 329-334, 339, 388-389, 400->395, 453-458
nvtabular/workflow/workflow.py 221 15 112 7 93% 28-29, 47, 139, 195, 222-224, 332, 347-348, 366-367, 503, 515

TOTAL 7799 1533 3139 347 78%
Coverage XML written to file coverage.xml

Required test coverage of 70% reached. Total coverage: 77.63%
=========================== short test summary info ============================
SKIPPED [1] ../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/io/tests/test_s3.py:16: could not import 's3fs': No module named 's3fs'
SKIPPED [8] tests/unit/test_io.py:581: could not import 'uavro': No module named 'uavro'
SKIPPED [1] tests/unit/loader/test_tf_dataloader.py:521: not working correctly in ci environment
==== 1 failed, 1533 passed, 10 skipped, 709 warnings in 1550.79s (0:25:50) =====
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://github.com/gitapi/repos/NVIDIA/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins8747499485828448747.sh

@benfred linked an issue Oct 6, 2021 that may be closed by this pull request
@nvidia-merlin-bot
Contributor

Click to view CI Results
GitHub pull request #1147 of commit 1201f11f5145fe1603a8d0869fec3a6f7e6d9b1a, no merge conflicts.
Running as SYSTEM
Setting status of 1201f11f5145fe1603a8d0869fec3a6f7e6d9b1a to PENDING with url http://10.20.13.93:8080/job/nvtabular_tests/3642/ and message: 'Pending'
Using context: Jenkins Unit Test Run
Building in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA-Merlin/NVTabular.git
 > git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/pull/1147/*:refs/remotes/origin/pr/1147/* # timeout=10
 > git rev-parse 1201f11f5145fe1603a8d0869fec3a6f7e6d9b1a^{commit} # timeout=10
Checking out Revision 1201f11f5145fe1603a8d0869fec3a6f7e6d9b1a (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1201f11f5145fe1603a8d0869fec3a6f7e6d9b1a # timeout=10
Commit message: "Merge branch 'main' into epochs-arg"
 > git rev-list --no-walk 977355078e04349112f16d6c8c78c769c82abe1f # timeout=10
First time build. Skipping changelog.
[nvtabular_tests] $ /bin/bash /tmp/jenkins1071359933304289213.sh
Installing NVTabular
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Requirement already satisfied: pip in /var/jenkins_home/.local/lib/python3.8/site-packages (21.3)
Requirement already satisfied: setuptools in /var/jenkins_home/.local/lib/python3.8/site-packages (58.2.0)
Requirement already satisfied: wheel in /var/jenkins_home/.local/lib/python3.8/site-packages (0.37.0)
Requirement already satisfied: pybind11 in /var/jenkins_home/.local/lib/python3.8/site-packages (2.8.0)
running develop
running egg_info
creating nvtabular.egg-info
writing nvtabular.egg-info/PKG-INFO
writing dependency_links to nvtabular.egg-info/dependency_links.txt
writing requirements to nvtabular.egg-info/requires.txt
writing top-level names to nvtabular.egg-info/top_level.txt
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching '*.h' under directory 'cpp'
warning: no files found matching '*.cu' under directory 'cpp'
warning: no files found matching '*.cuh' under directory 'cpp'
adding license file 'LICENSE'
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
running build_ext
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.8 -c flagcheck.cpp -o flagcheck.o -std=c++17
building 'nvtabular_cpp' extension
creating build
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/cpp
creating build/temp.linux-x86_64-3.8/cpp/nvtabular
creating build/temp.linux-x86_64-3.8/cpp/nvtabular/inference
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+28.g1201f11 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+28.g1201f11 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+28.g1201f11 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/categorify.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+28.g1201f11 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/fill.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -std=c++17 -fvisibility=hidden -g0
creating build/lib.linux-x86_64-3.8
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -o build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so
copying build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so -> 
Generating nvtabular/inference/triton/model_config_pb2.py from nvtabular/inference/triton/model_config.proto
Creating /var/jenkins_home/.local/lib/python3.8/site-packages/nvtabular.egg-link (link to .)
nvtabular 0.7.0+28.g1201f11 is already the active version in easy-install.pth

Installed /var/jenkins_home/workspace/nvtabular_tests/nvtabular
Processing dependencies for nvtabular==0.7.0+28.g1201f11
Searching for protobuf==3.18.0
Best match: protobuf 3.18.0
Adding protobuf 3.18.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for tensorflow-metadata==1.2.0
Best match: tensorflow-metadata 1.2.0
Processing tensorflow_metadata-1.2.0-py3.8.egg
tensorflow-metadata 1.2.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tensorflow_metadata-1.2.0-py3.8.egg
Searching for pyarrow==4.0.1
Best match: pyarrow 4.0.1
Adding pyarrow 4.0.1 to easy-install.pth file
Installing plasma_store script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for tqdm==4.61.2
Best match: tqdm 4.61.2
Processing tqdm-4.61.2-py3.8.egg
tqdm 4.61.2 is already the active version in easy-install.pth
Installing tqdm script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tqdm-4.61.2-py3.8.egg
Searching for numba==0.54.0
Best match: numba 0.54.0
Processing numba-0.54.0-py3.8-linux-x86_64.egg
numba 0.54.0 is already the active version in easy-install.pth
Installing pycc script to /var/jenkins_home/.local/bin
Installing numba script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/numba-0.54.0-py3.8-linux-x86_64.egg
Searching for pandas==1.3.3
Best match: pandas 1.3.3
Processing pandas-1.3.3-py3.8-linux-x86_64.egg
pandas 1.3.3 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg
Searching for distributed==2021.7.1
Best match: distributed 2021.7.1
Processing distributed-2021.7.1-py3.8.egg
distributed 2021.7.1 is already the active version in easy-install.pth
Installing dask-ssh script to /var/jenkins_home/.local/bin
Installing dask-scheduler script to /var/jenkins_home/.local/bin
Installing dask-worker script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/distributed-2021.7.1-py3.8.egg
Searching for dask==2021.7.1
Best match: dask 2021.7.1
Adding dask 2021.7.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for googleapis-common-protos==1.53.0
Best match: googleapis-common-protos 1.53.0
Processing googleapis_common_protos-1.53.0-py3.8.egg
googleapis-common-protos 1.53.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/googleapis_common_protos-1.53.0-py3.8.egg
Searching for absl-py==0.12.0
Best match: absl-py 0.12.0
Processing absl_py-0.12.0-py3.8.egg
absl-py 0.12.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/absl_py-0.12.0-py3.8.egg
Searching for numpy==1.20.2
Best match: numpy 1.20.2
Adding numpy 1.20.2 to easy-install.pth file
Installing f2py script to /var/jenkins_home/.local/bin
Installing f2py3 script to /var/jenkins_home/.local/bin
Installing f2py3.8 script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for setuptools==58.2.0
Best match: setuptools 58.2.0
Adding setuptools 58.2.0 to easy-install.pth file

Using /var/jenkins_home/.local/lib/python3.8/site-packages
Searching for llvmlite==0.37.0
Best match: llvmlite 0.37.0
Processing llvmlite-0.37.0-py3.8-linux-x86_64.egg
llvmlite 0.37.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/llvmlite-0.37.0-py3.8-linux-x86_64.egg
Searching for pytz==2021.1
Best match: pytz 2021.1
Adding pytz 2021.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for python-dateutil==2.8.2
Best match: python-dateutil 2.8.2
Adding python-dateutil 2.8.2 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for zict==2.0.0
Best match: zict 2.0.0
Processing zict-2.0.0-py3.8.egg
zict 2.0.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg
Searching for tornado==6.1
Best match: tornado 6.1
Processing tornado-6.1-py3.8-linux-x86_64.egg
tornado 6.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg
Searching for toolz==0.11.1
Best match: toolz 0.11.1
Processing toolz-0.11.1-py3.8.egg
toolz 0.11.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/toolz-0.11.1-py3.8.egg
Searching for tblib==1.7.0
Best match: tblib 1.7.0
Processing tblib-1.7.0-py3.8.egg
tblib 1.7.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg
Searching for sortedcontainers==2.4.0
Best match: sortedcontainers 2.4.0
Processing sortedcontainers-2.4.0-py3.8.egg
sortedcontainers 2.4.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg
Searching for PyYAML==5.4.1
Best match: PyYAML 5.4.1
Processing PyYAML-5.4.1-py3.8-linux-x86_64.egg
PyYAML 5.4.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg
Searching for psutil==5.8.0
Best match: psutil 5.8.0
Processing psutil-5.8.0-py3.8-linux-x86_64.egg
psutil 5.8.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg
Searching for msgpack==1.0.2
Best match: msgpack 1.0.2
Processing msgpack-1.0.2-py3.8-linux-x86_64.egg
msgpack 1.0.2 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/msgpack-1.0.2-py3.8-linux-x86_64.egg
Searching for cloudpickle==1.6.0
Best match: cloudpickle 1.6.0
Processing cloudpickle-1.6.0-py3.8.egg
cloudpickle 1.6.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/cloudpickle-1.6.0-py3.8.egg
Searching for click==8.0.1
Best match: click 8.0.1
Processing click-8.0.1-py3.8.egg
click 8.0.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/click-8.0.1-py3.8.egg
Searching for partd==1.2.0
Best match: partd 1.2.0
Processing partd-1.2.0-py3.8.egg
partd 1.2.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg
Searching for packaging==21.0
Best match: packaging 21.0
Adding packaging 21.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for fsspec==2021.10.1
Best match: fsspec 2021.10.1
Adding fsspec 2021.10.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for six==1.15.0
Best match: six 1.15.0
Adding six 1.15.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for HeapDict==1.0.1
Best match: HeapDict 1.0.1
Processing HeapDict-1.0.1-py3.8.egg
HeapDict 1.0.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg
Searching for locket==0.2.1
Best match: locket 0.2.1
Processing locket-0.2.1-py3.8.egg
locket 0.2.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg
Searching for pyparsing==2.4.7
Best match: pyparsing 2.4.7
Adding pyparsing 2.4.7 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Finished processing dependencies for nvtabular==0.7.0+28.g1201f11
Running black --check
All done! ✨ 🍰 ✨
138 files would be left unchanged.
Running flake8
Running isort
Skipped 2 files
Running bandit
Running pylint
************* Module nvtabular.ops.categorify
nvtabular/ops/categorify.py:497:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)
************* Module nvtabular.ops.fill
nvtabular/ops/fill.py:67:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)


Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)

Running flake8-nb
Building docs
make: Entering directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.7) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
make: Leaving directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-6.2.5, py-1.10.0, pluggy-1.0.0
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: forked-1.3.0, cov-3.0.0, xdist-2.4.0
collected 1559 items / 1 skipped / 1558 selected

tests/unit/test_dask_nvt.py ............................................ [ 2%]
....................................................................... [ 7%]
tests/unit/test_io.py .................................................. [ 10%]
........................................................................ [ 15%]
.................ssssssss............................................... [ 19%]
........ [ 20%]
tests/unit/test_notebooks.py ...... [ 20%]
tests/unit/test_tf4rec.py . [ 20%]
tests/unit/test_tools.py ...................... [ 22%]
tests/unit/test_triton_inference.py .............................. [ 24%]
tests/unit/columns/test_column_schemas.py .............................. [ 26%]
.................................................... [ 29%]
tests/unit/columns/test_column_selector.py .................... [ 30%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 30%]
tests/unit/framework_utils/test_tf_layers.py ........................... [ 32%]
................................................... [ 35%]
tests/unit/framework_utils/test_torch_layers.py . [ 35%]
tests/unit/loader/test_dataloader_backend.py .... [ 36%]
tests/unit/loader/test_tf_dataloader.py ................................ [ 38%]
........................................s.. [ 40%]
tests/unit/loader/test_torch_dataloader.py ............................. [ 42%]
........................................................ [ 46%]
tests/unit/ops/test_categorify.py ...................................... [ 48%]
............................................................... [ 52%]
tests/unit/ops/test_column_similarity.py ........................ [ 54%]
tests/unit/ops/test_fill.py ............................................ [ 57%]
........ [ 57%]
tests/unit/ops/test_hash_bucket.py ......................... [ 59%]
tests/unit/ops/test_join.py ............................................ [ 62%]
........................................................................ [ 66%]
................................ [ 68%]
tests/unit/ops/test_lambda.py .... [ 69%]
tests/unit/ops/test_normalize.py ....................................... [ 71%]
[ 71%]
tests/unit/ops/test_ops.py ............................................. [ 74%]
...................... [ 75%]
tests/unit/ops/test_ops_schema.py ...................................... [ 78%]
........................................................................ [ 82%]
........................................................................ [ 87%]
....................................... [ 89%]
tests/unit/ops/test_target_encode.py ..................... [ 91%]
tests/unit/workflow/test_cpu_workflow.py ...... [ 91%]
tests/unit/workflow/test_workflow.py ................................... [ 93%]
.......................................................... [ 97%]
tests/unit/workflow/test_workflow_node.py ........... [ 98%]
tests/unit/workflow/test_workflow_ops.py .. [ 98%]
tests/unit/workflow/test_workflow_schemas.py ....................... [100%]

=============================== warnings summary ===============================
tests/unit/test_dask_nvt.py: 3 warnings
tests/unit/test_io.py: 24 warnings
tests/unit/test_tf4rec.py: 1 warning
tests/unit/test_tools.py: 2 warnings
tests/unit/test_triton_inference.py: 5 warnings
tests/unit/loader/test_tf_dataloader.py: 50 warnings
tests/unit/loader/test_torch_dataloader.py: 16 warnings
tests/unit/ops/test_categorify.py: 2 warnings
tests/unit/ops/test_column_similarity.py: 7 warnings
tests/unit/ops/test_fill.py: 24 warnings
tests/unit/ops/test_normalize.py: 26 warnings
tests/unit/ops/test_ops.py: 3 warnings
tests/unit/ops/test_target_encode.py: 21 warnings
tests/unit/workflow/test_workflow.py: 30 warnings
tests/unit/workflow/test_workflow_node.py: 1 warning
tests/unit/workflow/test_workflow_schemas.py: 1 warning
/var/jenkins_home/.local/lib/python3.8/site-packages/numba-0.54.0-py3.8-linux-x86_64.egg/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (1) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))

tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 36 warnings
tests/unit/workflow/test_workflow.py: 44 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/workflow.py:89: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for execution. Please use the client argument to initialize a Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 52 warnings
tests/unit/workflow/test_workflow.py: 35 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dask.py:375: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for this write operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/test_io.py: 96 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/__init__.py:38: DeprecationWarning: ColumnGroup is deprecated, use ColumnSelector instead
warnings.warn("ColumnGroup is deprecated, use ColumnSelector instead", DeprecationWarning)

tests/unit/test_io.py: 24 warnings
tests/unit/loader/test_torch_dataloader.py: 54 warnings
tests/unit/workflow/test_workflow_node.py: 1 warning
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/node.py:47: FutureWarning: The ["a", "b", "c"] >> ops.Operator syntax for creating a ColumnGroup has been deprecated in NVTabular 21.09 and will be removed in a future version.
warnings.warn(

tests/unit/test_io.py: 20 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:516: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler is being used for this shuffle operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/ops/test_categorify.py::test_categorify_counts[True-True]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/categorify.py:1002: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df_0[name_count] = null_size

tests/unit/ops/test_fill.py::test_fill_missing[True-True-parquet]
tests/unit/ops/test_fill.py::test_fill_missing[True-False-parquet]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg/pandas/core/indexing.py:1732: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self._setitem_single_block(indexer, value, name)

tests/unit/ops/test_join.py: 80 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/join_external.py:191: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[tmp] = _arange(len(df), like_df=df, dtype="int32")

tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_groupby_op[id-True]
tests/unit/ops/test_ops.py::test_groupby_op[id-False]
/usr/local/lib/python3.8/dist-packages/dask/dataframe/core.py:6778: UserWarning: Insufficient elements for head. 1 elements requested, only 0 elements available. Try passing larger npartitions to head.
warnings.warn(msg.format(n, len(r)))

tests/unit/workflow/test_cpu_workflow.py: 14 warnings
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg/pandas/core/frame.py:3641: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self[k1] = value[k2]

-- Docs: https://docs.pytest.org/en/stable/warnings.html

---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name Stmts Miss Branch BrPart Cover Missing

examples/multi-gpu-movielens/torch_trainer.py 65 0 6 1 99% 32->36
examples/multi-gpu-movielens/torch_trainer_dist.py 63 0 2 0 100%
nvtabular/__init__.py 18 0 0 0 100%
nvtabular/columns/__init__.py 2 0 0 0 100%
nvtabular/columns/schema.py 216 17 105 19 89% 46->61, 49-50, 52-55, 57, 97->113, 105, 109, 155, 182, 268->275, 271->273, 282, 299->304, 302->304, 315, 339, 346, 355, 358, 363->362
nvtabular/columns/selector.py 74 1 34 0 99% 121
nvtabular/dispatch.py 290 55 144 23 79% 36-40, 45-47, 53-63, 70-71, 114-116, 121-124, 128-133, 140, 159, 170, 176, 181->183, 194, 217-220, 259->261, 268, 271, 277, 293, 300, 331->336, 334, 337, 340->344, 377, 388-391, 417-420, 450, 454, 495, 519, 521, 528
nvtabular/framework_utils/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/tensorflow/__init__.py 1 0 0 0 100%
nvtabular/framework_utils/tensorflow/feature_column_utils.py 134 78 90 15 39% 30, 99, 103, 114-130, 140, 143-158, 162, 166-167, 173-198, 207-217, 220-227, 229->233, 234, 239-279, 282
nvtabular/framework_utils/tensorflow/layers/__init__.py 4 0 0 0 100%
nvtabular/framework_utils/tensorflow/layers/embedding.py 153 12 85 6 91% 60, 68->49, 122, 179, 231-239, 335->343, 357->360, 363-364, 367
nvtabular/framework_utils/tensorflow/layers/interaction.py 47 25 20 1 43% 49, 74-103, 106-110, 113
nvtabular/framework_utils/tensorflow/layers/outer_product.py 30 24 10 0 15% 37-38, 41-60, 71-84, 87
nvtabular/framework_utils/tensorflow/tfrecords_to_parquet.py 58 58 30 0 0% 16-111
nvtabular/framework_utils/torch/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/torch/layers/__init__.py 2 0 0 0 100%
nvtabular/framework_utils/torch/layers/embeddings.py 32 2 14 2 91% 50, 91
nvtabular/framework_utils/torch/models.py 45 1 28 4 93% 57->61, 87->89, 93->96, 103
nvtabular/framework_utils/torch/utils.py 75 5 30 5 90% 51->53, 64, 71->76, 75, 118-120
nvtabular/inference/__init__.py 0 0 0 0 100%
nvtabular/inference/triton/__init__.py 385 210 180 13 45% 82-86, 141-174, 195-218, 263-307, 338, 364-372, 380-387, 406, 428-444, 485-489, 527-537, 583-623, 629-645, 649-716, 723->726, 726->722, 762-772, 781, 791, 812, 818-844, 850-876, 883, 889->892, 893
nvtabular/inference/triton/benchmarking_tools.py 52 52 10 0 0% 2-103
nvtabular/inference/triton/data_conversions.py 87 3 58 4 95% 32-33, 84
nvtabular/inference/triton/model.py 176 176 98 0 0% 27-332
nvtabular/inference/triton/model_config_pb2.py 299 0 2 0 100%
nvtabular/inference/triton/model_pt.py 101 101 40 0 0% 27-220
nvtabular/io/__init__.py 4 0 0 0 100%
nvtabular/io/avro.py 88 88 30 0 0% 16-189
nvtabular/io/csv.py 57 6 20 5 86% 22-23, 99, 103->107, 108, 110, 124
nvtabular/io/dask.py 183 8 72 11 93% 111, 114, 150, 401, 411, 428->431, 439, 443->445, 445->441, 450, 452
nvtabular/io/dataframe_engine.py 61 5 28 6 88% 19-20, 50, 69, 88->92, 92->97, 94->97, 97->116, 125
nvtabular/io/dataset.py 361 44 172 29 85% 47-48, 264, 266, 279, 304-318, 441->515, 446-449, 454->464, 471->469, 472->476, 489->493, 504, 515->524, 575-576, 577->581, 629, 752, 754, 756, 762, 766-768, 770, 830-831, 858, 865-866, 872, 878, 975-976, 1094-1099, 1105, 1117-1118, 1206
nvtabular/io/dataset_engine.py 31 1 4 0 97% 48
nvtabular/io/fsspec_utils.py 117 103 64 0 8% 26-27, 42-98, 103-135, 151-191, 213-263, 268-284, 288-290, 304-315
nvtabular/io/hugectr.py 45 2 24 2 91% 34, 74->97, 101
nvtabular/io/parquet.py 569 31 188 28 92% 34-35, 58, 80->103, 98, 123, 152-153, 170->195, 181->195, 232-240, 260, 266, 284->286, 300, 318->328, 321, 370->382, 374, 496-501, 539-544, 660->667, 728->733, 734-735, 855, 859, 863, 869, 901, 918, 922, 929->931, 1039->exit, 1049->1054, 1059->1069, 1074, 1096, 1123
nvtabular/io/shuffle.py 31 7 16 4 72% 42, 44-45, 49, 62-64
nvtabular/io/writer.py 185 13 74 5 92% 25-26, 52, 80, 126, 129, 213, 222, 225, 268, 300-302
nvtabular/io/writer_factory.py 18 2 8 2 85% 35, 60
nvtabular/loader/__init__.py 0 0 0 0 100%
nvtabular/loader/backend.py 355 13 146 11 95% 129, 144-145, 273->275, 285-289, 335-336, 375->379, 376->375, 450, 454-455, 485, 590, 598
nvtabular/loader/tensorflow.py 164 22 52 7 86% 57, 65-68, 83, 97, 307, 343, 358-360, 389-391, 401-409, 412-415
nvtabular/loader/tf_utils.py 55 10 20 5 80% 29->32, 32->34, 39->41, 43, 50-51, 58-60, 66-70
nvtabular/loader/torch.py 81 13 16 2 78% 25-27, 30-36, 111, 149-150
nvtabular/ops/__init__.py 23 0 0 0 100%
nvtabular/ops/add_metadata.py 17 2 2 1 84% 32, 46
nvtabular/ops/bucketize.py 37 10 18 3 69% 53-55, 59->exit, 62-65, 84-87, 94
nvtabular/ops/categorify.py 628 66 334 47 86% 244, 246, 263, 267, 275, 283, 285, 312, 331-332, 359, 370->374, 378-385, 467-468, 493-494, 611, 705, 722, 758, 835-836, 851-855, 856->820, 874, 882, 889->exit, 913, 916->919, 974, 979, 1000->1005, 1007->961, 1013-1016, 1028, 1032, 1034, 1041, 1046-1049, 1128, 1130, 1200->1223, 1206->1223, 1224-1229, 1266, 1285->1290, 1289, 1299->1296, 1304->1296, 1311, 1314, 1322-1332
nvtabular/ops/clip.py 18 2 6 3 79% 44, 52->54, 55
nvtabular/ops/column_similarity.py 118 25 38 5 74% 19-20, 78->exit, 108, 134, 198-199, 208-210, 218-234, 251->254, 255, 265
nvtabular/ops/data_stats.py 56 2 22 3 94% 91->93, 95, 97->87, 102
nvtabular/ops/difference_lag.py 31 1 8 1 95% 69->71, 94
nvtabular/ops/dropna.py 8 0 0 0 100%
nvtabular/ops/fill.py 91 12 36 3 82% 63-67, 93, 121, 147, 151, 162-165
nvtabular/ops/filter.py 20 1 6 1 92% 49
nvtabular/ops/groupby.py 129 4 78 5 96% 73, 84, 94->96, 106->111, 141, 145
nvtabular/ops/hash_bucket.py 41 2 20 2 93% 72, 106->112, 118
nvtabular/ops/hashed_cross.py 36 4 15 3 86% 53, 66, 81, 91
nvtabular/ops/internal/__init__.py 3 0 0 0 100%
nvtabular/ops/internal/concat_columns.py 11 0 0 0 100%
nvtabular/ops/internal/identity.py 6 1 0 0 83% 42
nvtabular/ops/internal/subset_columns.py 13 1 0 0 92% 53
nvtabular/ops/join_external.py 92 18 36 7 76% 20-21, 114, 116, 118, 135-161, 177->179, 216->227, 221
nvtabular/ops/join_groupby.py 101 7 36 4 92% 108, 115, 124, 131->130, 215-216, 219-220
nvtabular/ops/lambdaop.py 39 6 18 6 79% 59, 63, 77, 89, 94, 103
nvtabular/ops/list_slice.py 66 24 26 1 58% 21-22, 53-54, 104-118, 126-137
nvtabular/ops/logop.py 13 0 0 0 100%
nvtabular/ops/moments.py 65 0 20 0 100%
nvtabular/ops/normalize.py 81 10 14 1 86% 70, 78-79, 85, 118-119, 141-142, 146, 157
nvtabular/ops/operator.py 66 0 14 0 100%
nvtabular/ops/rename.py 41 3 22 3 90% 47, 88-90
nvtabular/ops/stat_operator.py 8 0 0 0 100%
nvtabular/ops/target_encoding.py 154 11 66 4 91% 168->172, 176->185, 233-234, 237-238, 250-256, 347->350, 363
nvtabular/ops/value_counts.py 31 1 6 2 92% 41->39, 49
nvtabular/tags.py 16 0 0 0 100%
nvtabular/tools/__init__.py 0 0 0 0 100%
nvtabular/tools/data_gen.py 236 1 62 1 99% 321
nvtabular/tools/dataset_inspector.py 50 7 18 1 79% 32-39
nvtabular/tools/inspector_script.py 46 46 0 0 0% 17-168
nvtabular/utils.py 106 43 48 8 54% 31-32, 36-37, 50, 61-62, 64-66, 69, 72, 78, 84, 90-126, 145, 149->153
nvtabular/worker.py 82 5 38 7 90% 24-25, 82->99, 91, 92->99, 99->102, 108, 110, 111->113
nvtabular/workflow/__init__.py 2 0 0 0 100%
nvtabular/workflow/node.py 240 18 116 10 89% 55, 93->98, 146, 248->252, 288, 302, 311, 329-334, 339, 388-389, 400->395, 453-458
nvtabular/workflow/workflow.py 221 15 112 7 93% 28-29, 47, 139, 195, 222-224, 332, 347-348, 366-367, 503, 515

TOTAL 7856 1536 3155 349 78%
Coverage XML written to file coverage.xml

Required test coverage of 70% reached. Total coverage: 77.73%
=========================== short test summary info ============================
SKIPPED [1] ../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/io/tests/test_s3.py:16: could not import 's3fs': No module named 's3fs'
SKIPPED [8] tests/unit/test_io.py:581: could not import 'uavro': No module named 'uavro'
SKIPPED [1] tests/unit/loader/test_tf_dataloader.py:522: not working correctly in ci environment
========= 1550 passed, 10 skipped, 686 warnings in 1653.77s (0:27:33) ==========
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://github.com/gitapi/repos/NVIDIA-Merlin/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins7387044484471872599.sh

@nvidia-merlin-bot
Contributor

Click to view CI Results
GitHub pull request #1147 of commit 94d810e540828393a1d407f550ac642d2fbd302a, no merge conflicts.
Running as SYSTEM
Setting status of 94d810e540828393a1d407f550ac642d2fbd302a to PENDING with url http://10.20.13.93:8080/job/nvtabular_tests/3644/ and message: 'Pending'
Using context: Jenkins Unit Test Run
Building in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA-Merlin/NVTabular.git
 > git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/pull/1147/*:refs/remotes/origin/pr/1147/* # timeout=10
 > git rev-parse 94d810e540828393a1d407f550ac642d2fbd302a^{commit} # timeout=10
Checking out Revision 94d810e540828393a1d407f550ac642d2fbd302a (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 94d810e540828393a1d407f550ac642d2fbd302a # timeout=10
Commit message: "Merge branch 'main' into epochs-arg"
 > git rev-list --no-walk 6a0f23b3eb784551e428d673430d4e4e365500e2 # timeout=10
First time build. Skipping changelog.
[nvtabular_tests] $ /bin/bash /tmp/jenkins5972456813095336820.sh
Installing NVTabular
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Requirement already satisfied: pip in /var/jenkins_home/.local/lib/python3.8/site-packages (21.3)
Requirement already satisfied: setuptools in /var/jenkins_home/.local/lib/python3.8/site-packages (58.2.0)
Requirement already satisfied: wheel in /var/jenkins_home/.local/lib/python3.8/site-packages (0.37.0)
Requirement already satisfied: pybind11 in /var/jenkins_home/.local/lib/python3.8/site-packages (2.8.0)
running develop
running egg_info
creating nvtabular.egg-info
writing nvtabular.egg-info/PKG-INFO
writing dependency_links to nvtabular.egg-info/dependency_links.txt
writing requirements to nvtabular.egg-info/requires.txt
writing top-level names to nvtabular.egg-info/top_level.txt
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching '*.h' under directory 'cpp'
warning: no files found matching '*.cu' under directory 'cpp'
warning: no files found matching '*.cuh' under directory 'cpp'
adding license file 'LICENSE'
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
running build_ext
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.8 -c flagcheck.cpp -o flagcheck.o -std=c++17
building 'nvtabular_cpp' extension
creating build
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/cpp
creating build/temp.linux-x86_64-3.8/cpp/nvtabular
creating build/temp.linux-x86_64-3.8/cpp/nvtabular/inference
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+30.g94d810e -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+30.g94d810e -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+30.g94d810e -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/categorify.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.7.0+30.g94d810e -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/fill.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -std=c++17 -fvisibility=hidden -g0
creating build/lib.linux-x86_64-3.8
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -o build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so
copying build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so -> 
Generating nvtabular/inference/triton/model_config_pb2.py from nvtabular/inference/triton/model_config.proto
Creating /var/jenkins_home/.local/lib/python3.8/site-packages/nvtabular.egg-link (link to .)
nvtabular 0.7.0+30.g94d810e is already the active version in easy-install.pth

Installed /var/jenkins_home/workspace/nvtabular_tests/nvtabular
Processing dependencies for nvtabular==0.7.0+30.g94d810e
Searching for protobuf==3.18.0
Best match: protobuf 3.18.0
Adding protobuf 3.18.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for tensorflow-metadata==1.2.0
Best match: tensorflow-metadata 1.2.0
Processing tensorflow_metadata-1.2.0-py3.8.egg
tensorflow-metadata 1.2.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tensorflow_metadata-1.2.0-py3.8.egg
Searching for pyarrow==4.0.1
Best match: pyarrow 4.0.1
Adding pyarrow 4.0.1 to easy-install.pth file
Installing plasma_store script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for tqdm==4.61.2
Best match: tqdm 4.61.2
Processing tqdm-4.61.2-py3.8.egg
tqdm 4.61.2 is already the active version in easy-install.pth
Installing tqdm script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tqdm-4.61.2-py3.8.egg
Searching for numba==0.54.0
Best match: numba 0.54.0
Processing numba-0.54.0-py3.8-linux-x86_64.egg
numba 0.54.0 is already the active version in easy-install.pth
Installing pycc script to /var/jenkins_home/.local/bin
Installing numba script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/numba-0.54.0-py3.8-linux-x86_64.egg
Searching for pandas==1.3.3
Best match: pandas 1.3.3
Processing pandas-1.3.3-py3.8-linux-x86_64.egg
pandas 1.3.3 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg
Searching for distributed==2021.7.1
Best match: distributed 2021.7.1
Processing distributed-2021.7.1-py3.8.egg
distributed 2021.7.1 is already the active version in easy-install.pth
Installing dask-ssh script to /var/jenkins_home/.local/bin
Installing dask-scheduler script to /var/jenkins_home/.local/bin
Installing dask-worker script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/distributed-2021.7.1-py3.8.egg
Searching for dask==2021.7.1
Best match: dask 2021.7.1
Adding dask 2021.7.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for googleapis-common-protos==1.53.0
Best match: googleapis-common-protos 1.53.0
Processing googleapis_common_protos-1.53.0-py3.8.egg
googleapis-common-protos 1.53.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/googleapis_common_protos-1.53.0-py3.8.egg
Searching for absl-py==0.12.0
Best match: absl-py 0.12.0
Processing absl_py-0.12.0-py3.8.egg
absl-py 0.12.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/absl_py-0.12.0-py3.8.egg
Searching for numpy==1.20.2
Best match: numpy 1.20.2
Adding numpy 1.20.2 to easy-install.pth file
Installing f2py script to /var/jenkins_home/.local/bin
Installing f2py3 script to /var/jenkins_home/.local/bin
Installing f2py3.8 script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for setuptools==58.2.0
Best match: setuptools 58.2.0
Adding setuptools 58.2.0 to easy-install.pth file

Using /var/jenkins_home/.local/lib/python3.8/site-packages
Searching for llvmlite==0.37.0
Best match: llvmlite 0.37.0
Processing llvmlite-0.37.0-py3.8-linux-x86_64.egg
llvmlite 0.37.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/llvmlite-0.37.0-py3.8-linux-x86_64.egg
Searching for pytz==2021.1
Best match: pytz 2021.1
Adding pytz 2021.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for python-dateutil==2.8.2
Best match: python-dateutil 2.8.2
Adding python-dateutil 2.8.2 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for zict==2.0.0
Best match: zict 2.0.0
Processing zict-2.0.0-py3.8.egg
zict 2.0.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg
Searching for tornado==6.1
Best match: tornado 6.1
Processing tornado-6.1-py3.8-linux-x86_64.egg
tornado 6.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg
Searching for toolz==0.11.1
Best match: toolz 0.11.1
Processing toolz-0.11.1-py3.8.egg
toolz 0.11.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/toolz-0.11.1-py3.8.egg
Searching for tblib==1.7.0
Best match: tblib 1.7.0
Processing tblib-1.7.0-py3.8.egg
tblib 1.7.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg
Searching for sortedcontainers==2.4.0
Best match: sortedcontainers 2.4.0
Processing sortedcontainers-2.4.0-py3.8.egg
sortedcontainers 2.4.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg
Searching for PyYAML==5.4.1
Best match: PyYAML 5.4.1
Processing PyYAML-5.4.1-py3.8-linux-x86_64.egg
PyYAML 5.4.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg
Searching for psutil==5.8.0
Best match: psutil 5.8.0
Processing psutil-5.8.0-py3.8-linux-x86_64.egg
psutil 5.8.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg
Searching for msgpack==1.0.2
Best match: msgpack 1.0.2
Processing msgpack-1.0.2-py3.8-linux-x86_64.egg
msgpack 1.0.2 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/msgpack-1.0.2-py3.8-linux-x86_64.egg
Searching for cloudpickle==1.6.0
Best match: cloudpickle 1.6.0
Processing cloudpickle-1.6.0-py3.8.egg
cloudpickle 1.6.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/cloudpickle-1.6.0-py3.8.egg
Searching for click==8.0.1
Best match: click 8.0.1
Processing click-8.0.1-py3.8.egg
click 8.0.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/click-8.0.1-py3.8.egg
Searching for fsspec==2021.10.1
Best match: fsspec 2021.10.1
Adding fsspec 2021.10.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for packaging==21.0
Best match: packaging 21.0
Adding packaging 21.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for partd==1.2.0
Best match: partd 1.2.0
Processing partd-1.2.0-py3.8.egg
partd 1.2.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg
Searching for six==1.15.0
Best match: six 1.15.0
Adding six 1.15.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for HeapDict==1.0.1
Best match: HeapDict 1.0.1
Processing HeapDict-1.0.1-py3.8.egg
HeapDict 1.0.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg
Searching for pyparsing==2.4.7
Best match: pyparsing 2.4.7
Adding pyparsing 2.4.7 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for locket==0.2.1
Best match: locket 0.2.1
Processing locket-0.2.1-py3.8.egg
locket 0.2.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg
Finished processing dependencies for nvtabular==0.7.0+30.g94d810e
Running black --check
All done! ✨ 🍰 ✨
138 files would be left unchanged.
Running flake8
Running isort
Skipped 2 files
Running bandit
Running pylint
************* Module nvtabular.ops.categorify
nvtabular/ops/categorify.py:497:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)
************* Module nvtabular.ops.fill
nvtabular/ops/fill.py:67:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)


Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)

Running flake8-nb
Building docs
make: Entering directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.7) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
make: Leaving directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-6.2.5, py-1.10.0, pluggy-1.0.0
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: forked-1.3.0, cov-3.0.0, xdist-2.4.0
collected 1559 items / 1 skipped / 1558 selected

tests/unit/test_dask_nvt.py ............................................ [ 2%]
....................................................................... [ 7%]
tests/unit/test_io.py .................................................. [ 10%]
........................................................................ [ 15%]
.................ssssssss............................................... [ 19%]
........ [ 20%]
tests/unit/test_notebooks.py ...... [ 20%]
tests/unit/test_tf4rec.py . [ 20%]
tests/unit/test_tools.py ...................... [ 22%]
tests/unit/test_triton_inference.py .............................. [ 24%]
tests/unit/columns/test_column_schemas.py .............................. [ 26%]
.................................................... [ 29%]
tests/unit/columns/test_column_selector.py .................... [ 30%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 30%]
tests/unit/framework_utils/test_tf_layers.py ........................... [ 32%]
................................................... [ 35%]
tests/unit/framework_utils/test_torch_layers.py . [ 35%]
tests/unit/loader/test_dataloader_backend.py .... [ 36%]
tests/unit/loader/test_tf_dataloader.py ................................ [ 38%]
........................................s.. [ 40%]
tests/unit/loader/test_torch_dataloader.py ............................. [ 42%]
........................................................ [ 46%]
tests/unit/ops/test_categorify.py ...................................... [ 48%]
............................................................... [ 52%]
tests/unit/ops/test_column_similarity.py ........................ [ 54%]
tests/unit/ops/test_fill.py ............................................ [ 57%]
........ [ 57%]
tests/unit/ops/test_hash_bucket.py ......................... [ 59%]
tests/unit/ops/test_join.py ............................................ [ 62%]
........................................................................ [ 66%]
................................ [ 68%]
tests/unit/ops/test_lambda.py .... [ 69%]
tests/unit/ops/test_normalize.py ....................................... [ 71%]
[ 71%]
tests/unit/ops/test_ops.py ............................................. [ 74%]
...................... [ 75%]
tests/unit/ops/test_ops_schema.py ...................................... [ 78%]
........................................................................ [ 82%]
........................................................................ [ 87%]
....................................... [ 89%]
tests/unit/ops/test_target_encode.py ..................... [ 91%]
tests/unit/workflow/test_cpu_workflow.py ...... [ 91%]
tests/unit/workflow/test_workflow.py ................................... [ 93%]
.......................................................... [ 97%]
tests/unit/workflow/test_workflow_node.py ........... [ 98%]
tests/unit/workflow/test_workflow_ops.py .. [ 98%]
tests/unit/workflow/test_workflow_schemas.py ....................... [100%]

=============================== warnings summary ===============================
tests/unit/test_dask_nvt.py: 3 warnings
tests/unit/test_io.py: 24 warnings
tests/unit/test_tf4rec.py: 1 warning
tests/unit/test_tools.py: 2 warnings
tests/unit/test_triton_inference.py: 5 warnings
tests/unit/loader/test_tf_dataloader.py: 50 warnings
tests/unit/loader/test_torch_dataloader.py: 16 warnings
tests/unit/ops/test_categorify.py: 2 warnings
tests/unit/ops/test_column_similarity.py: 7 warnings
tests/unit/ops/test_fill.py: 24 warnings
tests/unit/ops/test_normalize.py: 26 warnings
tests/unit/ops/test_ops.py: 3 warnings
tests/unit/ops/test_target_encode.py: 21 warnings
tests/unit/workflow/test_workflow.py: 30 warnings
tests/unit/workflow/test_workflow_node.py: 1 warning
tests/unit/workflow/test_workflow_schemas.py: 1 warning
/var/jenkins_home/.local/lib/python3.8/site-packages/numba-0.54.0-py3.8-linux-x86_64.egg/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (1) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))

tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 36 warnings
tests/unit/workflow/test_workflow.py: 44 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/workflow.py:89: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for execution. Please use the client argument to initialize a Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 52 warnings
tests/unit/workflow/test_workflow.py: 35 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dask.py:375: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for this write operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/test_io.py: 96 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/__init__.py:38: DeprecationWarning: ColumnGroup is deprecated, use ColumnSelector instead
warnings.warn("ColumnGroup is deprecated, use ColumnSelector instead", DeprecationWarning)

tests/unit/test_io.py: 24 warnings
tests/unit/loader/test_torch_dataloader.py: 54 warnings
tests/unit/workflow/test_workflow_node.py: 1 warning
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/node.py:47: FutureWarning: The ["a", "b", "c"] >> ops.Operator syntax for creating a ColumnGroup has been deprecated in NVTabular 21.09 and will be removed in a future version.
warnings.warn(

tests/unit/test_io.py: 20 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:516: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler is being used for this shuffle operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(
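
The scheduler warnings above are configuration hints rather than failures. Below is a minimal sketch of what they suggest, assuming a local dask.distributed cluster; the file paths and column names are placeholders and do not come from this log.

```python
# Minimal sketch (assumptions: local CPU cluster, placeholder paths/columns).
# The warnings above suggest passing an explicit dask.distributed client so
# Workflow execution and Dataset shuffle/write use the distributed scheduler.
from dask.distributed import Client, LocalCluster

import nvtabular as nvt
from nvtabular import ops

client = Client(LocalCluster(n_workers=2))  # or a dask_cuda.LocalCUDACluster on GPUs

features = ["x", "y"] >> ops.Normalize()          # placeholder continuous columns
workflow = nvt.Workflow(features, client=client)  # distributed execution
dataset = nvt.Dataset("data/*.parquet", client=client)

workflow.fit(dataset)
workflow.transform(dataset).to_parquet("transformed/")  # distributed write
```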

tests/unit/ops/test_categorify.py::test_categorify_counts[True-True]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/categorify.py:1002: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df_0[name_count] = null_size

tests/unit/ops/test_fill.py::test_fill_median[True-True-op_columns1-parquet-0.01]
tests/unit/ops/test_fill.py::test_fill_median[True-True-op_columns1-parquet-0.1]
tests/unit/ops/test_fill.py::test_fill_median[True-True-op_columns1-csv-0.01]
tests/unit/ops/test_fill.py::test_fill_median[True-True-op_columns1-csv-0.1]
tests/unit/ops/test_fill.py::test_fill_median[True-True-op_columns1-csv-no-header-0.01]
tests/unit/ops/test_fill.py::test_fill_median[True-True-op_columns1-csv-no-header-0.1]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:125: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[f"{col}_filled"] = df[col].isna()

tests/unit/ops/test_fill.py::test_fill_median[True-True-op_columns1-parquet-0.01]
tests/unit/ops/test_fill.py::test_fill_median[True-True-op_columns1-parquet-0.1]
tests/unit/ops/test_fill.py::test_fill_median[True-True-op_columns1-csv-0.01]
tests/unit/ops/test_fill.py::test_fill_median[True-True-op_columns1-csv-0.1]
tests/unit/ops/test_fill.py::test_fill_median[True-True-op_columns1-csv-no-header-0.01]
tests/unit/ops/test_fill.py::test_fill_median[True-True-op_columns1-csv-no-header-0.1]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:126: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[col] = df[col].fillna(self.medians[col])

tests/unit/ops/test_fill.py::test_fill_missing[True-True-parquet]
tests/unit/ops/test_fill.py::test_fill_missing[True-False-parquet]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg/pandas/core/indexing.py:1732: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self._setitem_single_block(indexer, value, name)

tests/unit/ops/test_fill.py::test_fill_missing[True-True-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:54: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[f"{col}_filled"] = df[col].isna()

tests/unit/ops/test_fill.py::test_fill_missing[True-True-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:55: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[col] = df[col].fillna(self.fill_val)

tests/unit/ops/test_join.py: 80 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/join_external.py:191: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[tmp] = _arange(len(df), like_df=df, dtype="int32")

tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_groupby_op[id-True]
tests/unit/ops/test_ops.py::test_groupby_op[id-False]
/usr/local/lib/python3.8/dist-packages/dask/dataframe/core.py:6778: UserWarning: Insufficient elements for head. 1 elements requested, only 0 elements available. Try passing larger npartitions to head.
warnings.warn(msg.format(n, len(r)))

tests/unit/workflow/test_cpu_workflow.py: 14 warnings
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg/pandas/core/frame.py:3641: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self[k1] = value[k2]

-- Docs: https://docs.pytest.org/en/stable/warnings.html

---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name Stmts Miss Branch BrPart Cover Missing

examples/multi-gpu-movielens/torch_trainer.py 65 0 6 1 99% 32->36
examples/multi-gpu-movielens/torch_trainer_dist.py 63 0 2 0 100%
nvtabular/__init__.py 18 0 0 0 100%
nvtabular/columns/__init__.py 2 0 0 0 100%
nvtabular/columns/schema.py 216 14 105 17 90% 46->61, 52-55, 97->113, 105, 109, 155, 182, 268->275, 271->273, 282, 299->304, 302->304, 315, 339, 346, 355, 358, 363->362
nvtabular/columns/selector.py 74 1 34 0 99% 121
nvtabular/dispatch.py 290 55 144 23 79% 36-40, 45-47, 53-63, 70-71, 114-116, 121-124, 128-133, 140, 159, 170, 176, 181->183, 194, 217-220, 259->261, 268, 271, 277, 293, 300, 331->336, 334, 337, 340->344, 377, 388-391, 417-420, 450, 454, 495, 519, 521, 528
nvtabular/framework_utils/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/tensorflow/__init__.py 1 0 0 0 100%
nvtabular/framework_utils/tensorflow/feature_column_utils.py 134 78 90 15 39% 30, 99, 103, 114-130, 140, 143-158, 162, 166-167, 173-198, 207-217, 220-227, 229->233, 234, 239-279, 282
nvtabular/framework_utils/tensorflow/layers/__init__.py 4 0 0 0 100%
nvtabular/framework_utils/tensorflow/layers/embedding.py 153 12 85 6 91% 60, 68->49, 122, 179, 231-239, 335->343, 357->360, 363-364, 367
nvtabular/framework_utils/tensorflow/layers/interaction.py 47 25 20 1 43% 49, 74-103, 106-110, 113
nvtabular/framework_utils/tensorflow/layers/outer_product.py 30 24 10 0 15% 37-38, 41-60, 71-84, 87
nvtabular/framework_utils/tensorflow/tfrecords_to_parquet.py 58 58 30 0 0% 16-111
nvtabular/framework_utils/torch/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/torch/layers/__init__.py 2 0 0 0 100%
nvtabular/framework_utils/torch/layers/embeddings.py 32 2 14 2 91% 50, 91
nvtabular/framework_utils/torch/models.py 45 1 28 4 93% 57->61, 87->89, 93->96, 103
nvtabular/framework_utils/torch/utils.py 75 5 30 5 90% 51->53, 64, 71->76, 75, 118-120
nvtabular/inference/__init__.py 0 0 0 0 100%
nvtabular/inference/triton/__init__.py 385 210 180 13 45% 82-86, 141-174, 195-218, 263-307, 338, 364-372, 380-387, 406, 428-444, 485-489, 527-537, 583-623, 629-645, 649-716, 723->726, 726->722, 762-772, 781, 791, 812, 818-844, 850-876, 883, 889->892, 893
nvtabular/inference/triton/benchmarking_tools.py 52 52 10 0 0% 2-103
nvtabular/inference/triton/data_conversions.py 87 3 58 4 95% 32-33, 84
nvtabular/inference/triton/model.py 176 176 98 0 0% 27-332
nvtabular/inference/triton/model_config_pb2.py 299 0 2 0 100%
nvtabular/inference/triton/model_pt.py 101 101 40 0 0% 27-220
nvtabular/io/__init__.py 4 0 0 0 100%
nvtabular/io/avro.py 88 88 30 0 0% 16-189
nvtabular/io/csv.py 57 6 20 5 86% 22-23, 99, 103->107, 108, 110, 124
nvtabular/io/dask.py 183 8 72 11 93% 111, 114, 150, 401, 411, 428->431, 439, 443->445, 445->441, 450, 452
nvtabular/io/dataframe_engine.py 61 5 28 6 88% 19-20, 50, 69, 88->92, 92->97, 94->97, 97->116, 125
nvtabular/io/dataset.py 361 44 172 29 85% 47-48, 264, 266, 279, 304-318, 441->515, 446-449, 454->464, 471->469, 472->476, 489->493, 504, 515->524, 575-576, 577->581, 629, 752, 754, 756, 762, 766-768, 770, 830-831, 858, 865-866, 872, 878, 975-976, 1094-1099, 1105, 1117-1118, 1206
nvtabular/io/dataset_engine.py 31 1 4 0 97% 48
nvtabular/io/fsspec_utils.py 117 103 64 0 8% 26-27, 42-98, 103-135, 151-191, 213-263, 268-284, 288-290, 304-315
nvtabular/io/hugectr.py 45 2 24 2 91% 34, 74->97, 101
nvtabular/io/parquet.py 569 31 188 28 92% 34-35, 58, 80->103, 98, 123, 152-153, 170->195, 181->195, 232-240, 260, 266, 284->286, 300, 318->328, 321, 370->382, 374, 496-501, 539-544, 660->667, 728->733, 734-735, 855, 859, 863, 869, 901, 918, 922, 929->931, 1039->exit, 1049->1054, 1059->1069, 1074, 1096, 1123
nvtabular/io/shuffle.py 31 7 16 4 72% 42, 44-45, 49, 62-64
nvtabular/io/writer.py 185 13 74 5 92% 25-26, 52, 80, 126, 129, 213, 222, 225, 268, 300-302
nvtabular/io/writer_factory.py 18 2 8 2 85% 35, 60
nvtabular/loader/__init__.py 0 0 0 0 100%
nvtabular/loader/backend.py 355 13 146 11 95% 129, 144-145, 273->275, 285-289, 335-336, 375->379, 376->375, 450, 454-455, 485, 590, 598
nvtabular/loader/tensorflow.py 164 22 52 7 86% 57, 65-68, 83, 97, 307, 343, 358-360, 389-391, 401-409, 412-415
nvtabular/loader/tf_utils.py 55 10 20 5 80% 29->32, 32->34, 39->41, 43, 50-51, 58-60, 66-70
nvtabular/loader/torch.py 81 13 16 2 78% 25-27, 30-36, 111, 149-150
nvtabular/ops/__init__.py 23 0 0 0 100%
nvtabular/ops/add_metadata.py 17 2 2 1 84% 32, 46
nvtabular/ops/bucketize.py 37 10 18 3 69% 53-55, 59->exit, 62-65, 84-87, 94
nvtabular/ops/categorify.py 628 66 334 47 86% 244, 246, 263, 267, 275, 283, 285, 312, 331-332, 359, 370->374, 378-385, 467-468, 493-494, 611, 705, 722, 758, 835-836, 851-855, 856->820, 874, 882, 889->exit, 913, 916->919, 974, 979, 1000->1005, 1007->961, 1013-1016, 1028, 1032, 1034, 1041, 1046-1049, 1128, 1130, 1200->1223, 1206->1223, 1224-1229, 1266, 1285->1290, 1289, 1299->1296, 1304->1296, 1311, 1314, 1322-1332
nvtabular/ops/clip.py 18 2 6 3 79% 44, 52->54, 55
nvtabular/ops/column_similarity.py 118 25 38 5 74% 19-20, 78->exit, 108, 134, 198-199, 208-210, 218-234, 251->254, 255, 265
nvtabular/ops/data_stats.py 56 2 22 3 94% 91->93, 95, 97->87, 102
nvtabular/ops/difference_lag.py 31 1 8 1 95% 69->71, 94
nvtabular/ops/dropna.py 8 0 0 0 100%
nvtabular/ops/fill.py 91 12 36 3 82% 63-67, 93, 121, 147, 151, 162-165
nvtabular/ops/filter.py 20 1 6 1 92% 49
nvtabular/ops/groupby.py 129 4 78 5 96% 73, 84, 94->96, 106->111, 141, 145
nvtabular/ops/hash_bucket.py 41 2 20 2 93% 72, 106->112, 118
nvtabular/ops/hashed_cross.py 36 4 15 3 86% 53, 66, 81, 91
nvtabular/ops/internal/__init__.py 3 0 0 0 100%
nvtabular/ops/internal/concat_columns.py 11 0 0 0 100%
nvtabular/ops/internal/identity.py 6 1 0 0 83% 42
nvtabular/ops/internal/subset_columns.py 13 1 0 0 92% 53
nvtabular/ops/join_external.py 92 18 36 7 76% 20-21, 114, 116, 118, 135-161, 177->179, 216->227, 221
nvtabular/ops/join_groupby.py 101 7 36 4 92% 108, 115, 124, 131->130, 215-216, 219-220
nvtabular/ops/lambdaop.py 39 6 18 6 79% 59, 63, 77, 89, 94, 103
nvtabular/ops/list_slice.py 66 24 26 1 58% 21-22, 53-54, 104-118, 126-137
nvtabular/ops/logop.py 13 0 0 0 100%
nvtabular/ops/moments.py 65 0 20 0 100%
nvtabular/ops/normalize.py 81 10 14 1 86% 70, 78-79, 85, 118-119, 141-142, 146, 157
nvtabular/ops/operator.py 66 0 14 0 100%
nvtabular/ops/rename.py 41 3 22 3 90% 47, 88-90
nvtabular/ops/stat_operator.py 8 0 0 0 100%
nvtabular/ops/target_encoding.py 154 11 66 4 91% 168->172, 176->185, 233-234, 237-238, 250-256, 347->350, 363
nvtabular/ops/value_counts.py 31 1 6 2 92% 41->39, 49
nvtabular/tags.py 16 0 0 0 100%
nvtabular/tools/__init__.py 0 0 0 0 100%
nvtabular/tools/data_gen.py 236 1 62 1 99% 321
nvtabular/tools/dataset_inspector.py 50 7 18 1 79% 32-39
nvtabular/tools/inspector_script.py 46 46 0 0 0% 17-168
nvtabular/utils.py 106 43 48 8 54% 31-32, 36-37, 50, 61-62, 64-66, 69, 72, 78, 84, 90-126, 145, 149->153
nvtabular/worker.py 82 5 38 7 90% 24-25, 82->99, 91, 92->99, 99->102, 108, 110, 111->113
nvtabular/workflow/__init__.py 2 0 0 0 100%
nvtabular/workflow/node.py 240 18 116 10 89% 55, 93->98, 146, 248->252, 288, 302, 311, 329-334, 339, 388-389, 400->395, 453-458
nvtabular/workflow/workflow.py 221 15 112 7 93% 28-29, 47, 139, 195, 222-224, 332, 347-348, 366-367, 503, 515

TOTAL 7856 1533 3155 347 78%
Coverage XML written to file coverage.xml

Required test coverage of 70% reached. Total coverage: 77.78%
=========================== short test summary info ============================
SKIPPED [1] ../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/io/tests/test_s3.py:16: could not import 's3fs': No module named 's3fs'
SKIPPED [8] tests/unit/test_io.py:581: could not import 'uavro': No module named 'uavro'
SKIPPED [1] tests/unit/loader/test_tf_dataloader.py:522: not working correctly in ci environment
========= 1550 passed, 10 skipped, 700 warnings in 1804.99s (0:30:04) ==========
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://github.com/gitapi/repos/NVIDIA-Merlin/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins388328227613537736.sh

@jperez999 jperez999 merged commit a8f15a2 into NVIDIA-Merlin:main Oct 20, 2021
@rjzamora rjzamora deleted the epochs-arg branch October 22, 2021 18:26
mikemckiernan pushed a commit that referenced this pull request Nov 24, 2022
…1147)

* add epochs arg to dataloaders

* fix test

* add dedicated method that returns a multi-epoch dataloader

* remove stale epochs kwargs

* improve test
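
For context, here is a hedged sketch of how the multi-epoch loading described in these commits might be used. The `epochs()` method and `epochs=` keyword names, the file path, and the column names are illustrative assumptions rather than verbatim API from this PR.

```python
# Hedged sketch (names/paths are assumptions, not taken verbatim from this PR).
import nvtabular as nvt
from nvtabular.loader.tensorflow import KerasSequenceLoader

dataset = nvt.Dataset("train/*.parquet")

# Dataset side: iterate the partitions for two epochs in a single pass,
# instead of re-creating the iterator after every epoch.
for df in dataset.to_iter(epochs=2):
    ...  # consume each partition (yielded once per epoch)

# Dataloader side: request a multi-epoch iterator from the loader up front,
# avoiding the per-epoch "reload" cost described in the linked issue below.
loader = KerasSequenceLoader(
    dataset,
    batch_size=65536,
    cat_names=["c1"],
    cont_names=["x"],
    label_names=["y"],
)
for X, y in loader.epochs(2):
    ...  # two epochs' worth of batches from one iterator
```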
Successfully merging this pull request may close these issues.

[BUG] Long time for KerasSequenceLoader to "reload" after each epoch