Storage blktests - nvme-tcp: nvme/022: lr : check_flush_dependency+0x138/0x140
Snippet of the test failure:
# https://beaker-archive.host.prod.eng.bos.redhat.com/beaker-logs/2022/06/66846/6684667/12080370/145553440/taskout.log
>>> 2022-06-02 22:41:21 | Start to run test case nvme-tcp: /mnt/tests/gitlab.com/cki-project/kernel-tests/-/archive/main/kernel-tests-main.zip/storage/blktests/nvme/nvme-tcp/blktests/tests/nvme/022 ...
nvme/022 (test NVMe reset command on NVMeOF file-backed ns)
nvme/022 (test NVMe reset command on NVMeOF file-backed ns) [failed]
runtime ... 2.109s
something found in dmesg:
[ 5822.968453] run blktests nvme/022 at 2022-06-02 22:41:21
[ 5823.004661] nvmet: adding nsid 1 to subsystem blktests-subsystem-1
[ 5823.011777] nvmet_tcp: enabling port 0 (127.0.0.1:4420)
[ 5823.018735] nvmet: creating nvm controller 1 for subsystem blktests-subsystem-1 for NQN nqn.2014-08.org.nvmexpress:uuid:cff5fd96-93bd-11e9-a78f-3c18a00c1956.
[ 5823.019318] nvme nvme0: creating 32 I/O queues.
[ 5823.023805] nvme nvme0: mapped 32/0/0 default/read/poll queues.
[ 5823.035911] nvme nvme0: new ctrl: NQN "blktests-subsystem-1", addr 127.0.0.1:4420
[ 5824.073657] nvme nvme0: resetting controller
[ 5824.196046] nvmet: creating nvm controller 2 for subsystem blktests-subsystem-1 for NQN nqn.2014-08.org.nvmexpress:uuid:cff5fd96-93bd-11e9-a78f-3c18a00c1956.
[ 5824.196612] nvme nvme0: creating 32 I/O queues.
...
(See '/mnt/tests/gitlab.com/cki-project/kernel-tests/-/archive/main/kernel-tests-main.zip/storage/blktests/nvme/nvme-tcp/blktests/results/nodev/nvme/022.dmesg' for the entire message)
>>> 2022-06-02 22:41:24 | End nvme-tcp: /mnt/tests/gitlab.com/cki-project/kernel-tests/-/archive/main/kernel-tests-main.zip/storage/blktests/nvme/nvme-tcp/blktests/tests/nvme/022 | FAIL
# https://s3.us-east-1.amazonaws.com/arr-cki-prod-datawarehouse-public/datawarehouse-public/2022/06/02/554671920/redhat:554671920_aarch64/tests/Storage_blktests_nvme_tcp/12080370_aarch64_1_022.dmesg
[ 5824.821534] Hardware name: Lenovo HR330A 7X33CTO1WW /FALCON , BIOS hve104q-1.14 06/25/2020
[ 5824.831347] Workqueue: nvmet-wq nvmet_tcp_release_queue_work [nvmet_tcp]
[ 5824.838040] pstate: 004000c5 (nzcv daIF +PAN -UAO -TCO -DIT -SSBS BTYPE=--)
[ 5824.844989] pc : check_flush_dependency+0x138/0x140
[ 5824.849855] lr : check_flush_dependency+0x138/0x140
[ 5824.854721] sp : ffff80000ee8bc30
[ 5824.858022] x29: ffff80000ee8bc30 x28: 00000000000000e0 x27: ffff009f6d620768
[ 5824.865146] x26: ffff80000812c3ec x25: ffff0009f0248e00 x24: fffffbffef8c6d00
[ 5824.872269] x23: ffff80000ee8bcd0 x22: ffff00087db78000 x21: ffff00087c288600
[ 5824.879392] x20: ffff00087dd0fc00 x19: ffff80000141bfcc x18: 0000000000000002
[ 5824.886515] x17: ffff80000a0201c8 x16: 00000000000000cc x15: ffff8000088a0cdc
[ 5824.893638] x14: ffff8000091aa698 x13: 000000000000004e x12: 0000000000000018
[ 5824.900761] x11: 0000000000000000 x10: 0000000000000027 x9 : 0000000000000023
[ 5824.907884] x8 : ffff009f6d610490 x7 : 7f7f7f7f7f7f7f7f x6 : 0000000000000020
[ 5824.915007] x5 : ffff80000ee8b6f7 x4 : ffff0a00ffffff04 x3 : ffff0a00ffffff05
[ 5824.922131] x2 : 4000000100000473 x1 : 0000000100000473 x0 : 0000000000000092
[ 5824.929255] Call trace:
[ 5824.931688] check_flush_dependency+0x138/0x140
[ 5824.936207] start_flush_work+0xd8/0x2ac
[ 5824.940117] __cancel_work_timer+0x128/0x1b8
[ 5824.944374] cancel_work_sync+0x20/0x30
[ 5824.948197] nvmet_tcp_release_queue_work+0x80/0x2d4 [nvmet_tcp]
[ 5824.954194] process_one_work+0x1e0/0x40c
[ 5824.958190] worker_thread+0x1e4/0x3f8
[ 5824.961927] kthread+0xd4/0x558
[ 5824.965056] ret_from_fork+0x10/0x20
[ 5824.968620] ---[ end trace 0000000000000000 ]---
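The warning above comes from the workqueue core: `check_flush_dependency()` fires when a work item running on a `WQ_MEM_RECLAIM` workqueue (here `nvmet-wq`, running `nvmet_tcp_release_queue_work`) flushes or cancels work queued on a workqueue created without `WQ_MEM_RECLAIM`, since under memory pressure the target queue may stall and deadlock the reclaim path waiting on it. A toy Python model of that check (purely illustrative; the real logic lives in `kernel/workqueue.c`, and the name of the offending target queue, `nvmet_tcp_wq`, is an assumption based on the `cancel_work_sync` caller in the trace):

```python
# Toy model of kernel/workqueue.c:check_flush_dependency() -- illustrative only.
WQ_MEM_RECLAIM = 1 << 3  # flag *value* is arbitrary here


class Workqueue:
    def __init__(self, name, flags=0):
        self.name = name
        self.flags = flags


def check_flush_dependency(current_wq, target_wq):
    """Return a warning string when a MEM_RECLAIM workqueue waits on a
    non-MEM_RECLAIM one; None otherwise.

    Under memory pressure the target workqueue may be unable to make
    forward progress, deadlocking the reclaim path that is waiting on it.
    """
    if current_wq.flags & WQ_MEM_RECLAIM and not (target_wq.flags & WQ_MEM_RECLAIM):
        return (f"workqueue: WQ_MEM_RECLAIM {current_wq.name} is flushing "
                f"!WQ_MEM_RECLAIM {target_wq.name}")
    return None


# The situation in this trace: nvmet-wq has WQ_MEM_RECLAIM, but the queue
# whose work it cancels (assumed nvmet_tcp_wq) was created without the flag.
nvmet_wq = Workqueue("nvmet-wq", WQ_MEM_RECLAIM)
nvmet_tcp_wq = Workqueue("nvmet_tcp_wq")  # flag missing -- triggers the warning
print(check_flush_dependency(nvmet_wq, nvmet_tcp_wq))
```

Adding `WQ_MEM_RECLAIM` to the target queue (or dropping it from the flusher) makes the check pass, which is the usual shape of fixes for this class of warning.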
Test logs on DataWarehouse

DataWarehouse issue
- DW issue: https://datawarehouse.cki-project.org/issue/1284

Regex: https://datawarehouse.cki-project.org/issue/-/regex/1238
- Text Match: lr\s*:\s*check_flush_dependency\+0x\S+\/0x\S+
- (Log) File Name Match: 022.dmesg
- Test Name Match: Storage blktests - nvme-tcp
- KPET Tree Name Match: (upstream|rawhide)
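As a sanity check, the text-match and tree-name regexes above can be exercised against the `lr` line from the trace (the Python harness and the `is_known_failure` helper are illustrative, not part of DataWarehouse):

```python
import re

# Match criteria copied from the DataWarehouse issue above.
TEXT_MATCH = re.compile(r"lr\s*:\s*check_flush_dependency\+0x\S+\/0x\S+")
TREE_MATCH = re.compile(r"(upstream|rawhide)")


def is_known_failure(dmesg_line: str, tree_name: str) -> bool:
    """Return True when a dmesg line and KPET tree name hit this issue's matchers."""
    return bool(TEXT_MATCH.search(dmesg_line)) and bool(TREE_MATCH.search(tree_name))


# The 'lr' line from the call trace triggers the text match on an upstream tree.
line = "[ 5824.849855] lr : check_flush_dependency+0x138/0x140"
print(is_known_failure(line, "upstream"))   # True
print(is_known_failure(line, "rhel-9"))     # False: tree name not matched
```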
Additional details
N/A
If opening the links above results in a 404 page on DataWarehouse, please make sure you are logged into DataWarehouse via Red Hat SSO.