You can write this as a Linux shell script, or in Perl or PHP.
I need a script to download a static file from several different servers and measure the download speed. The script should do the following:
1) read a config file in this format: (this must be in a separate file from the script itself)
# specify an arbitrary number of friendly_name and IP pairs. The exact data structure you use doesn't matter.
friendly_names_and_ips = [ ["name1", "[url removed, login to view]"], ["name2", "[url removed, login to view]"] ]
# this URL is the same for all IPs
url = "/some_path/[url removed, login to view]"
2) for each element in friendly_names_and_ips: (in sequence, not in parallel)
- wget "http://$ip:80$url"
- determine the average download speed in KB/s. For example, "wget [url removed, login to view]" shows:
2012-08-01 10:57:36 (91.0 KB/s) - `index.html.1' saved 
---> save 91.0
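The speed-extraction step above could be sketched like this in a shell script. This is only a sketch: the `parse_speed` function name is my own, and the regex is an assumption about wget's summary-line format (it matches the sample line quoted in the spec).

```shell
#!/bin/sh
# Hedged sketch: pull the average speed (in KB/s) out of wget's summary
# line. Note wget writes its progress/summary output to stderr, so the
# real invocation needs 2>&1. The regex assumes the "(NN.N KB/s)" form
# shown in the spec's example.

parse_speed() {
    # Print the number inside "(NN.N KB/s)" from a wget summary line.
    sed -n 's/.*(\([0-9.]*\) KB\/s).*/\1/p'
}

# Demonstration against the sample line quoted above:
sample="2012-08-01 10:57:36 (91.0 KB/s) - \`index.html.1' saved"
echo "$sample" | parse_speed   # prints 91.0

# In the real script it would be used roughly as:
#   speed=$(wget -O /dev/null "http://$ip:80$url" 2>&1 | parse_speed)
```

Discarding the body with `-O /dev/null` avoids piling up `index.html.1`-style files between runs.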
3) output results in CSV format, one row per server, with these columns (matching the example output below):
- current datetime (taken when printing the output, not when the test was run; the test might have finished a few seconds earlier, but that's OK)
- friendly_name
- IP
- URL path
- average download speed in KB/s
Don't write the column names. Only write the data values.
When I run this script, its only output should be something like this:
2012-08-01 12:12:12,name1,[url removed, login to view],/some_path/[url removed, login to view],91.0
2012-08-01 12:12:12,name2,[url removed, login to view],/some_path/[url removed, login to view],82.0
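The whole flow might be sketched as below. Everything here is an assumption offered for illustration, not part of the requirements: the function names (`config_url`, `config_pairs`, `run`) are mine, and the parsing assumes the config file looks exactly like the two-line format shown in step 1.

```shell
#!/bin/sh
# Hedged sketch of the full flow: parse the config, download from each
# server in sequence, print one CSV row per server.

# Extract the shared URL path from a line like: url = "/some_path/file"
config_url() {
    sed -n 's/^url *= *"\(.*\)".*/\1/p' "$1"
}

# Emit each ["name", "ip"] pair from friendly_names_and_ips as a
# "name ip" line, one pair per line.
config_pairs() {
    grep '^friendly_names_and_ips' "$1" |
    grep -o '\["[^"]*", *"[^"]*"\]' |
    sed 's/\["\([^"]*\)", *"\([^"]*\)"\]/\1 \2/'
}

# Main loop: sequential downloads (not parallel), CSV on stdout only.
run() {
    conf=$1
    url=$(config_url "$conf")
    config_pairs "$conf" | while read -r name ip; do
        # wget's summary goes to stderr; discard the downloaded file.
        speed=$(wget -O /dev/null "http://$ip:80$url" 2>&1 |
                sed -n 's/.*(\([0-9.]*\) KB\/s).*/\1/p')
        printf '%s,%s,%s,%s,%s\n' "$(date '+%Y-%m-%d %H:%M:%S')" \
               "$name" "$ip" "$url" "$speed"
    done
}

# Usage: run /path/to/config
```

The datetime is taken at print time inside the loop, which satisfies the "when printing the output" requirement above.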
I'd like this completed within 24 hours of project start. It needs to run on Ubuntu 10.04 LTS. Please test it on your own machine; I won't provide a test environment.
Thank you for your time, I look forward to seeing your bid!