Long pathnames patch

Josef Möllers requested to merge jmoellers/wget:bsc1181173 into master

I'm submitting this on behalf of a colleague who has some problems with the copyright statement.

When running recursively, wget checks the length of the entire URL when saving files. This can cause it to overwrite files with truncated names, emitting "The name is too long, ... trying to shorten" messages.

Our test case is:

$ wget --content-disposition -l inf -x -np -r -o wget_log -A '.gz' $URL

where $URL has more than (255 - CHOMP_BUFFER) characters. The length check code checks the whole $URL, not each path element.

This is wrong because, on disk, the filesystem applies the filename length limit(*) to each directory and file component separately (obviously).

So, in this patch, I moved the length check code into a separate function and call it from append_dir_structure() for each path element.
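A minimal sketch of the idea (not the actual patch; `components_fit` is a hypothetical helper, and the limit is taken as `NAME_MAX`, the usual 255-byte per-component cap): instead of comparing the whole path against the limit, walk the '/'-separated components and check each one individually.

```c
#include <limits.h>
#include <string.h>

#ifndef NAME_MAX
#define NAME_MAX 255
#endif

/* Hypothetical helper: check each '/'-separated component of a path
   against the per-component filename limit, instead of checking the
   whole path at once.  Returns 1 if every component fits, 0 otherwise. */
static int components_fit(const char *path)
{
    const char *start = path;
    for (;;) {
        const char *slash = strchr(start, '/');
        size_t len = slash ? (size_t)(slash - start) : strlen(start);
        if (len > NAME_MAX)
            return 0;          /* this single component is too long */
        if (!slash)
            return 1;          /* last component checked, all fit */
        start = slash + 1;
    }
}
```

With this, a deep path whose total length far exceeds 255 characters is still accepted as long as no single directory or file name exceeds the limit, which matches what the filesystem actually enforces.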

(*) All modern filesystems have a 255-character filename length limit, which makes me wonder whether these checks are actually useful/necessary at all.
