[upstream] blktests: discontiguous-io.cpp:92:24: error: ‘uintptr_t’ was not declared in this scope

Snippet of test failure

make[1]: Entering directory '/mnt/tests/gitlab.com/redhat/centos-stream/tests/kernel/kernel-tests/-/archive/main/kernel-tests-main.zip/storage/blktests/nvme/nvme-tcp/blktests/src'
cc  -O2 -Wall -Wshadow  -DHAVE_LINUX_BLKZONED_H -o loblksize loblksize.c
cc  -O2 -Wall -Wshadow  -DHAVE_LINUX_BLKZONED_H -o loop_change_fd loop_change_fd.c
cc  -O2 -Wall -Wshadow  -DHAVE_LINUX_BLKZONED_H -o loop_get_status_null loop_get_status_null.c
cc  -O2 -Wall -Wshadow  -DHAVE_LINUX_BLKZONED_H -o mount_clear_sock mount_clear_sock.c
cc  -O2 -Wall -Wshadow  -DHAVE_LINUX_BLKZONED_H -o nbdsetsize nbdsetsize.c
cc  -O2 -Wall -Wshadow  -DHAVE_LINUX_BLKZONED_H -o openclose openclose.c
cc  -O2 -Wall -Wshadow  -DHAVE_LINUX_BLKZONED_H -o sg/dxfer-from-dev sg/dxfer-from-dev.c
cc  -O2 -Wall -Wshadow  -DHAVE_LINUX_BLKZONED_H -o sg/syzkaller1 sg/syzkaller1.c
cc  -O2 -Wall -Wshadow  -DHAVE_LINUX_BLKZONED_H -o zbdioctl zbdioctl.c
g++  -O2 -std=c++11 -Wall -Wextra -Wshadow -Wno-sign-compare -Werror  -DHAVE_LINUX_BLKZONED_H -o discontiguous-io discontiguous-io.cpp
discontiguous-io.cpp: In function ‘void dumphex(std::ostream&, const void*, size_t)’:
discontiguous-io.cpp:92:24: error: ‘uintptr_t’ was not declared in this scope
   92 |                    << (uintptr_t)a + i << ':';
      |                        ^~~~~~~~~
discontiguous-io.cpp:15:1: note: ‘uintptr_t’ is defined in header ‘<cstdint>’; did you forget to ‘#include <cstdint>’?
   14 | #include <vector>
  +++ |+#include <cstdint>
   15 | 
discontiguous-io.cpp:97:43: error: ‘uint8_t’ was not declared in this scope
   97 |                            << (unsigned)((uint8_t*)a)[j];
      |                                           ^~~~~~~

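The compiler note above already points at the likely fix: with newer GCC/libstdc++ (GCC 13 and later), <cstdint> is no longer pulled in transitively by other standard headers, so it has to be included explicitly. A minimal sketch of the change, assuming the include block around line 14 of discontiguous-io.cpp shown in the error output:

    #include <vector>
    #include <cstdint>  // declares uintptr_t and uint8_t used by dumphex()
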
Test logs on DataWarehouse

https://datawarehouse.cki-project.org/kcidb/tests/6808615

DataWarehouse issue

https://datawarehouse.cki-project.org/issue/1851

If opening the links above results in a 404 page on DataWarehouse, please make sure you are correctly logged into DataWarehouse via Red Hat SSO.