Simultaneous URL requests
From: Ciaran M. <cmc...@gm...> - 2012-06-27 17:24:32
Hi,

curl-loader is a great piece of software - keep up the good work!

I was wondering if it's possible to fetch a list of URLs simultaneously. I have a script that automates the generation of large numbers of URLs. I'm looking to transfer a known amount of unique data at a known throughput through a proxy device. At the moment, curl-loader processes each URL sequentially for each client. I'd like each URL to be downloaded once, by its own client. E.g.:

########## GENERAL SECTION ##########
BATCH_NAME=sim_get
CLIENTS_NUM_MAX=1000
CLIENTS_NUM_START=1000
CLIENTS_RAMPUP_INC=1
INTERFACE=eth0
NETMASK=255.255.224.0
IP_ADDR_MIN=172.16.47.100
IP_ADDR_MAX=172.16.47.100
CYCLES_NUM=1
URLS_NUM=1

########## URLs SECTION #############
URL=http://172.16.62.110/cgi-bin/bytegenerator?size=100&seed=1
URL_SHORT_NAME="1"
REQUEST_TYPE=GET
TRANSFER_LIMIT_RATE=1000

URL=http://172.16.62.110/cgi-bin/bytegenerator?size=100&seed=2
URL_SHORT_NAME="2"
REQUEST_TYPE=GET
TRANSFER_LIMIT_RATE=1000

URL=http://172.16.62.110/cgi-bin/bytegenerator?size=100&seed=3
URL_SHORT_NAME="3"
REQUEST_TYPE=GET
TRANSFER_LIMIT_RATE=1000

..
..
..

URL=http://172.16.62.110/cgi-bin/bytegenerator?size=100&seed=1000
URL_SHORT_NAME="1000"
REQUEST_TYPE=GET
TRANSFER_LIMIT_RATE=1000

Thanks and regards!
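(For context, the generator script mentioned above essentially just loops over seeds. A minimal Python sketch of that idea, with the host, size, and rate values copied from the config above and an arbitrary output file name, might look like this; the hand-written general section would then be prepended to the generated file:

#!/usr/bin/env python
# Sketch: emit the repeated URL sections of a curl-loader batch file.
# N, BASE, and the output file name "sim_get_urls.conf" are assumptions
# mirroring the example config above; adjust to taste.

N = 1000
BASE = "http://172.16.62.110/cgi-bin/bytegenerator"

with open("sim_get_urls.conf", "w") as f:
    for seed in range(1, N + 1):
        # One four-line URL section per seed, as in the example config.
        f.write("URL=%s?size=100&seed=%d\n" % (BASE, seed))
        f.write('URL_SHORT_NAME="%d"\n' % seed)
        f.write("REQUEST_TYPE=GET\n")
        f.write("TRANSFER_LIMIT_RATE=1000\n")
        f.write("\n")
)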