I should qualify that by saying I really don't know that there isn't.
I could very well be wrong about FileZilla and large files.
If that's the case, is there anything I can do?
Server is likely to be OK - ftp.mirror.ac.uk
I've now come across mentions that there's a general 2GB limit
when using FTP - but haven't been able to find out much about it.
Does anyone know?
A lot of operating systems and filesystems run into trouble
with >2GB files; they store file sizes and offsets in signed
32-bit integers, which top out at 2^31-1 (2147483647) bytes.
Just a guess ....
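For what it's worth, here's where that magic number comes from
(just a sketch in Python, nothing FileZilla-specific): a signed
32-bit integer tops out at 2^31-1, and anything past that wraps
negative, which is why so much software chokes right at 2GB.

import ctypes

# Largest value a signed 32-bit size/offset field can hold.
print(2**31 - 1)                          # 2147483647 -- the 2GB boundary

# One byte past that wraps to a negative number if the offset is
# kept in a signed 32-bit integer, as older servers and OSes do.
print(ctypes.c_int32(2**31).value)        # -2147483648
print(ctypes.c_int32(2**31 + 100).value)  # -2147483548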
It may be that the server resumes from byte 2^31-1 (2147483647)
whenever FileZilla requests a restart at a higher byte than that.
If that's the case, once your local filesize is larger than 2GB,
you will always see something like
Command: REST [number larger than 2147483647]
Response: 350 Restarting at 2147483647. Send RETR to initiate transfer
in the logs. Unfortunately, if that turns out to be the case, I
don't know what you might be able to do about it.
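If you want to check whether that's what the server is doing
without waiting for another 2GB download to fail, you can poke it
by hand. Here's a rough Python sketch (the offset is arbitrary,
anonymous login is assumed, and some servers may refuse a REST
sent outside a transfer, so treat it as a probe only):

from ftplib import FTP

# Ask the server to restart past the 2GB boundary and see what
# offset it echoes back in the 350 reply.
ftp = FTP('ftp.mirror.ac.uk')            # the server mentioned above
ftp.login()                              # anonymous login
ftp.voidcmd('TYPE I')                    # binary mode, like a real resume
reply = ftp.sendcmd('REST 3000000000')   # anything above 2147483647
print(reply)   # "350 Restarting at 3000000000" -> server handles it
               # "350 Restarting at 2147483647" -> the suspected clamp
ftp.quit()

If the offset comes back clamped to 2147483647, that would match
the behaviour in the log above.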
The best reference I can give to these sorts of problems is part
of a developers' list thread,
<http://curl.haxx.se/mail/lib-2003-12/0091.html>. I don't see any
resolution there, but it does list several screwy responses from
different servers when trying a REST over 2GB.