6.2.0 Regression with Intel GVT-g
Host environment
- Operating system: ArchLinux
- OS/kernel version: 5.15.8-arch1-1 (custom patches for gvt-g)
- Architecture: x86_64
- QEMU flavor: qemu-system-x86_64
- QEMU version: 6.2.0
- QEMU command line: libvirt
```
/usr/bin/qemu-system-x86_64 \
  # ... \
  -device vfio-pci,id=hostdev0,sysfsdev=/sys/bus/mdev/devices/$GVT_UUID,display=on,bus=pci.5,addr=0x0 \
  -set device.hostdev0.x-igd-opregion=on \
  -set device.hostdev0.display=on \
  -set device.hostdev0.ramfb=on \
  -set device.hostdev0.driver=vfio-pci-nohotplug \
  -set device.hostdev0.romfile=/vms/win11-cefet/vbios_gvt_uefi.rom
```
Emulated/Virtualized environment
- Operating system: Windows 11
- Architecture: x86_64
Description of problem
Up to version 6.1.0, Intel GVT-g graphics passthrough worked flawlessly. Since version 6.2.0, the exact same machine configuration no longer works: QEMU fails at startup, reporting that the graphics device cannot be found.

```
qemu-system-x86_64: -set device.hostdev0.x-igd-opregion=on: there is no device "hostdev0" defined
```
Downgrading to 6.1.0 fixes the problem.
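The error suggests that in 6.2.0 the `-set` options are no longer able to resolve the `hostdev0` device created by the preceding `-device` option. A possible workaround (untested, sketched from the command line above) is to drop the `-set` overrides and pass every property inline on the `-device` option itself, using the `vfio-pci-nohotplug` driver directly:

```shell
# Hypothetical workaround: fold all -set overrides into the -device option.
# $GVT_UUID and the romfile path are taken from the original command line.
/usr/bin/qemu-system-x86_64 \
  -device vfio-pci-nohotplug,id=hostdev0,sysfsdev=/sys/bus/mdev/devices/$GVT_UUID,display=on,ramfb=on,x-igd-opregion=on,bus=pci.5,addr=0x0,romfile=/vms/win11-cefet/vbios_gvt_uefi.rom
  # ... rest of the original command line unchanged
```

Since libvirt generates this command line, the equivalent change would have to be made via the domain XML / QEMU passthrough arguments.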
Steps to reproduce
- Create a virtual machine with a GVT-g mediated device.
- Try to start the machine.
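For reference, the first step can be sketched as follows (a minimal GVT-g mdev setup; the PCI address and mdev type name are examples and vary by host, so check `mdev_supported_types` on the actual machine):

```shell
# Create a GVT-g mediated device on the integrated GPU (example paths).
GVT_GPU=/sys/devices/pci0000:00/0000:00:02.0   # assumed iGPU PCI address
GVT_UUID=$(uuidgen)
echo "$GVT_UUID" > "$GVT_GPU/mdev_supported_types/i915-GVTg_V5_4/create"
# The resulting device then appears under /sys/bus/mdev/devices/$GVT_UUID
# and is what the -device vfio-pci,sysfsdev=... option points at.
```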
Edited by Julio Campagnolo