How to maximize parallel downloads from S3
Posted by StCee on Server Fault on 2012-10-19
I have a lot of images to load from Amazon S3 on a single page, and sometimes it takes quite a while for all of them to load. I have heard that splitting the images across different sub-domains helps parallelize the downloads, but what does the actual implementation look like? Splitting into sub-domains by purpose (static, images, etc.) is easy enough, but should I create something like 10 sub-domains (image1, image2, ...) to load, say, 100 images? Or is there a cleverer way to do it?
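To make the question concrete, here is a minimal sketch of what I have in mind: hash each image's path to one of a small, fixed set of shard hostnames, so the same image is always requested from the same host and the browser cache is not defeated. The hostnames and the shard count here are just placeholders, not anything S3 requires.

```python
import zlib

# Placeholder shard hostnames; each would be a CNAME (or a bucket/CDN alias)
# pointing at the same S3 content. The names and the count are assumptions.
SHARD_HOSTS = [f"img{i}.example.com" for i in range(1, 5)]

def sharded_url(image_path: str) -> str:
    """Map an image path to a stable shard hostname.

    Hashing the path (rather than assigning hosts round-robin per page)
    keeps the choice deterministic, so an image cached from one page is
    reused on every other page that shows it.
    """
    shard = zlib.crc32(image_path.encode("utf-8")) % len(SHARD_HOSTS)
    return f"https://{SHARD_HOSTS[shard]}/{image_path.lstrip('/')}"

if __name__ == "__main__":
    for path in ("photos/cat1.jpg", "photos/cat2.jpg", "thumbs/cat1.jpg"):
        print(sharded_url(path))
```

My understanding is that browsers open only around six parallel connections per hostname, so would two to four shards already be enough for 100 images, with anything beyond that mostly adding extra DNS lookups?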
(By the way, I am also considering using memcached to cache the S3 images; I am not sure whether that is even feasible. There is a rough sketch of what I mean at the end of this post.) I would be grateful for any further comments. Thanks a lot!
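For the memcached part, this is roughly what I am picturing, assuming boto3 and pymemcache, with a placeholder bucket name and a local memcached instance; I realize memcached's default item size limit is about 1 MB, so it would presumably only help for smaller images:

```python
import boto3
from pymemcache.client.base import Client

s3 = boto3.client("s3")
cache = Client(("localhost", 11211))  # assumed local memcached instance

BUCKET = "my-image-bucket"  # placeholder bucket name

def get_image(key: str) -> bytes:
    """Return image bytes, serving from memcached when possible."""
    cached = cache.get(key)
    if cached is not None:
        return cached
    # Cache miss: fetch from S3, then store for subsequent requests.
    body = s3.get_object(Bucket=BUCKET, Key=key)["Body"].read()
    cache.set(key, body, expire=3600)  # keep for an hour
    return body
```

Does this kind of application-side proxying make sense, or would it just get in the way of the parallel downloads above?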