The film’s average bitrate is 62,997 kbit/s and it’s being sent over TLS. You’re on a non-plus Vero 4K, so its Ethernet tops out at around 93 Mbit/s in practice, and that’s before protocol overheads. You haven’t said what the laptop is, but it will have a faster CPU, and I’m guessing it’s on either gigabit Ethernet or 802.11ac WiFi.
Is this really the problem, though? I remember Sam saying that the Vero 4K+ shouldn’t always be needed for 4K files, and that the gigabit port was nice to have rather than a necessity. I don’t mind shelling out for a Vero 4K+, but I want to be sure it will fix the problem.
Laptop is Lenovo V155
AMD Ryzen 5 3500U 2.1GHz
Wired on what should be a gigabit Ethernet connection, yes.
If it doesn’t stream smoothly, there will be a reason for it; I can only offer suggestions as to what that reason might be.
The average bit rate for the rip is 62,997 kbit/s. That’s only an average: the real-world stream will contain peaks that are considerably higher.
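A quick back-of-envelope check makes the point. The link and bitrate figures are the ones discussed above; the 1.5x peak-to-average factor is purely an assumption for illustration, as the real peaks depend on the encode.

```python
# Does ~93 Mbit/s of usable Fast Ethernet leave headroom for bitrate
# peaks? Link and average figures are from the thread; the peak
# factor is an assumed illustration, not a measured value.

LINK_MBIT = 93.0          # practical Fast Ethernet throughput
AVG_MBIT = 62997 / 1000   # rip's average bitrate, ~63 Mbit/s
PEAK_FACTOR = 1.5         # assumed ratio of peak to average bitrate

peak_mbit = AVG_MBIT * PEAK_FACTOR
print(f"average: {AVG_MBIT:.1f} Mbit/s -> fits: {AVG_MBIT < LINK_MBIT}")
print(f"assumed peak: {peak_mbit:.1f} Mbit/s -> fits: {peak_mbit < LINK_MBIT}")
```

So even before TLS and WebDAV overheads, the average only just fits, and any realistic peak does not.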
TLS will create an overhead. Not huge, but it won’t help on a smaller CPU.
You’re using WebDAV to get the data. I don’t have any practical experience with WebDAV, but it’s unlikely to be particularly efficient.
But I’m guessing that by far the biggest problem is network latency. On a local network, it might take 5 milliseconds to contact your server for the next block, whereas you might be seeing 10x or 20x that on “the cloud”. (And because it’s the cloud, it can be very difficult to know where the actual data server is physically located.) Plus, local storage can often take advantage of read-ahead to help speed things along. All in all, it’s never going to be as efficient as reading from a LAN-based server.
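A toy model shows how badly latency hurts if the player fetches one block at a time and pays a full round trip before each block arrives. The 256 KiB block size and the RTT values are assumptions for illustration; real clients may pipeline or read ahead and do better.

```python
# Toy model: throughput when each block read costs one round trip
# plus its transfer time. Block size and RTTs are assumptions.

def effective_mbit(block_bytes, rtt_s, link_mbit):
    """Throughput of strictly sequential block fetches, in Mbit/s."""
    bits = block_bytes * 8
    transfer_s = bits / (link_mbit * 1e6)   # time on the wire
    return bits / ((rtt_s + transfer_s) * 1e6)

BLOCK = 256 * 1024  # 256 KiB per request (assumed)
for rtt_ms, where in [(5, "LAN"), (50, "nearby cloud"), (100, "distant cloud")]:
    print(f"{where}: {effective_mbit(BLOCK, rtt_ms / 1000, 93):.1f} Mbit/s")
```

Under these assumptions, a 5 ms LAN round trip still delivers most of the link, while 50–100 ms cloud round trips drop the effective rate well below the film’s average bitrate.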
I would say so too. Cloud storage providers often throttle bandwidth, so transfers above a certain rate simply won’t be allowed through. He should test what nominal speed he can download the files at, to get an idea of the maximum usable bandwidth; that will be unrelated to the bandwidth his ISP provides, of course.
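One rough way to run that test is to time a partial download of a large file from the same cloud host. This is a sketch using only the standard library; the URL is a placeholder to be replaced with a file on the actual WebDAV server.

```python
# Measure the nominal download rate from a server by timing a
# partial download. The URL passed in is a placeholder/assumption;
# point it at a large file on the cloud host being tested.

import time
from urllib.request import urlopen

def measure_mbit(url, max_bytes=50 * 1024 * 1024, chunk=64 * 1024):
    """Download up to max_bytes from url and return the rate in Mbit/s."""
    start = time.monotonic()
    total = 0
    with urlopen(url) as resp:
        while total < max_bytes:
            data = resp.read(chunk)
            if not data:
                break
            total += len(data)
    elapsed = max(time.monotonic() - start, 1e-9)  # guard divide-by-zero
    return total * 8 / (elapsed * 1e6)

# Example (placeholder URL):
# print(f"{measure_mbit('https://example.com/big.bin'):.1f} Mbit/s")
```

If the measured rate comes in below the film’s ~63 Mbit/s average, no amount of local hardware will make it stream smoothly.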