I need a script that I can use to test the performance of our proxy servers.
The script should accept the following input parameters (a small parsing sketch follows the list):
- a list of several hundred to several thousand URLs
- a concurrency setting (the maximum number of new connections per second)
- proxy IP, proxy username, and proxy password (optional; if omitted, the connection is made without a proxy)
- a timeout setting for the connection
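To make the parameter list concrete, here is a minimal Python sketch of the command-line interface; the flag names and defaults are illustrative assumptions, not part of the requirements.

```python
#!/usr/bin/env python3
# Illustrative argument parsing only; flag names and defaults are assumptions.
import argparse

def parse_args():
    p = argparse.ArgumentParser(description="Proxy performance test")
    p.add_argument("--urls", required=True,
                   help="path to a file with one URL per line")
    p.add_argument("--concurrency", type=int, default=50,
                   help="maximum number of new connections started per second")
    p.add_argument("--proxy", default=None,
                   help="proxy IP (optional; without it requests go direct)")
    p.add_argument("--proxy-user", default=None)
    p.add_argument("--proxy-pass", default=None)
    p.add_argument("--timeout", type=float, default=30.0,
                   help="per-request timeout in seconds")
    return p.parse_args()
```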
The script should try to retrieve the list of URLs (downloading the whole page) with the defined maximum concurrency (i.e. the number of parallel connections to the proxy server).
The script needs to parse the downloaded pages to determine whether there was an error. An error is either a specific HTTP status code, a specific string in the page, a downloaded page smaller than a certain size, or an expired timeout. A 503 status response is also possible and should be counted separately.
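A minimal sketch of that classification, assuming the error string, error status codes, and minimum page size come from configuration; the values shown are placeholders, not part of the spec.

```python
# Placeholder values standing in for the configurable error criteria.
ERROR_STATUS_CODES = {403, 407}      # example "specific HTTP status codes"
ERROR_STRING = "Access Denied"       # example "specific string"
MIN_PAGE_SIZE = 1024                 # example minimum page size in bytes

def classify(status, body, timed_out):
    """Return one of: 'timeout', '503', 'error', 'success'."""
    if timed_out:
        return "timeout"
    if status == 503:                # counted separately per the requirements
        return "503"
    if (status in ERROR_STATUS_CODES
            or ERROR_STRING in body
            or len(body) < MIN_PAGE_SIZE):
        return "error"
    return "success"
```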
The script should produce the following output after the test is completed (a reporting sketch follows the list):
- Total number of requests sent
- Total number of successful requests
- Total number of errors (string match, status code, or undersized page)
- Total number of requests that received a 503 HTTP status response
- Total number of timeouts
- Minimum time to complete a request (successful)
- Average time to complete a request (successful)
- Median time to complete a request (successful)
- Maximum time to complete a request (successful)
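A minimal reporting sketch, assuming each request was recorded as an (outcome, elapsed_seconds) pair using the classification above; the labels and layout are illustrative.

```python
# Summarise collected results; `results` is a list of (outcome, elapsed) pairs.
import statistics

def report(results):
    counts = {"success": 0, "error": 0, "503": 0, "timeout": 0}
    times = []
    for outcome, elapsed in results:
        counts[outcome] += 1
        if outcome == "success":
            times.append(elapsed)
    print("Total requests sent:   ", len(results))
    print("Successful requests:   ", counts["success"])
    print("Errors:                ", counts["error"])
    print("503 responses:         ", counts["503"])
    print("Timeouts:              ", counts["timeout"])
    if times:
        print("Min time (success):    ", min(times))
        print("Avg time (success):    ", statistics.mean(times))
        print("Median time (success): ", statistics.median(times))
        print("Max time (success):    ", max(times))
```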
The script needs to run on Linux. Expect the concurrency to be around 50-100 parallel connections. The script needs to be able to keep sending the configured number of new connections every second, even if the old requests have not completed yet.
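A minimal sketch of that constant-rate dispatch loop, assuming an async fetch(url) coroutine (for example built on aiohttp) that downloads one page through the proxy and returns an (outcome, elapsed) pair; the names and structure are assumptions, not a finished implementation.

```python
# Start up to `per_second` new requests every second, regardless of how many
# earlier requests are still in flight. `fetch(url)` is assumed to be an async
# coroutine that performs one proxied download and returns (outcome, elapsed).
import asyncio

async def run_test(urls, per_second, fetch):
    tasks = []
    it = iter(urls)
    exhausted = False
    while not exhausted:
        batch = []
        for _ in range(per_second):
            try:
                batch.append(next(it))
            except StopIteration:
                exhausted = True
                break
        tasks.extend(asyncio.create_task(fetch(u)) for u in batch)
        if not exhausted:
            await asyncio.sleep(1)   # wait one second before the next batch
    return await asyncio.gather(*tasks)   # list of (outcome, elapsed) pairs
```

The returned list of (outcome, elapsed) pairs can then be fed into the reporting sketch above, e.g. report(asyncio.run(run_test(urls, args.concurrency, fetch))).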
Awarded to:
Hi Sir, I'm interested in your project. I have long experience in Perl and in web scraping (using proxies too). Do not hesitate to ask me any questions. Best regards