Getting a file via network (HTTP)
Hello, I need to download files from an HTTP server. I'm using libcurl and it works fine when I know the file size in advance (from the Content-Length header). But I found that some servers do not provide the size of the requested document in the response headers, so now I'm a bit confused: I need an efficient routine to store the content from the network in memory without any idea of its size.
I guess I should use std::vector, but as far as I know, it allocates some memory, and when that memory is not enough, it reallocates and copies the data from the old location to the new one. This doesn't seem like a very good (fast?) solution; maybe somebody can suggest a better one?
