
Perl function to fetch html code

$30-250 USD

Completed
Posted almost 14 years ago

$30-250 USD

Paid on delivery
Hi, I need a Perl function "grab" that retrieves all URLs in @urls and puts their contents into an array (where $array[0] holds the content of $url[0], etc.). If the timeout is exceeded, the content will be "timeout". It's important that this function can deal very efficiently with a large number of URLs. In the past I've used LWP::Parallel::UserAgent to accomplish this.

I'd like to use the code as shown in the script example below, so please formulate your function so that I can drop it into the script and it will work. Thank you.

Also, please take a look at the Google Answers solution page at [login to view URL]. It shows the solution I obtained for the same issue back in 2004, using LWP::Parallel::UserAgent. Looking at it might give you some useful pointers. However, the solution on that page doesn't work anymore because of a recently emerged problem with LWP::Parallel::UserAgent, which is explained at [login to view URL] (it's because of that problem that I've posted this project here).

The script will have to run in a [login to view URL] environment, and currently it seems that LWP::Parallel::UserAgent is not working with the pre-installed version of LWP on Bluehost. So it would be preferable to have a solution that does not depend on LWP::Parallel::UserAgent at all but is still capable of quickly retrieving a large number of URLs. Thank you. Marc.

#!/usr/bin/perl

@urls = ("[login to view URL]", "[login to view URL]", "[login to view URL]");
$timeout = 20;    # each request times out after 20 seconds

@content = &grab(@urls);
print "everything grabbed";
exit;

sub grab {
    # your grab function here
}
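As a point of reference (this sketch is not part of the original posting), one possible shape for grab() is a fork-per-URL fetch with plain LWP::UserAgent and pipes, which avoids LWP::Parallel::UserAgent entirely. Whether Bluehost's process limits tolerate one child per URL, and the placeholder URLs below, are assumptions.

#!/usr/bin/perl
# Sketch only: parallel fetch via fork + pipes, one child per URL. Any failed
# or timed-out request is reported as "timeout", per the project description.
use strict;
use warnings;
use LWP::UserAgent;

our $timeout = 20;    # each request times out after 20 seconds

my @urls    = ("http://www.example.com/", "http://www.example.org/");  # placeholders
my @content = grab(@urls);
print "everything grabbed\n";

sub grab {
    my @urls = @_;
    my (@content, @readers);

    for my $i (0 .. $#urls) {
        pipe(my $reader, my $writer) or die "pipe failed: $!";
        my $pid = fork();
        die "fork failed: $!" unless defined $pid;
        if ($pid == 0) {                                   # child: fetch one URL
            close $reader;
            my $ua   = LWP::UserAgent->new(timeout => $timeout);
            my $res  = $ua->get($urls[$i]);
            my $body = $res->is_success ? $res->decoded_content : undef;
            print {$writer} defined $body ? $body : "timeout";
            close $writer;
            exit 0;
        }
        close $writer;                                     # parent keeps the read end
        $readers[$i] = $reader;
    }

    for my $i (0 .. $#urls) {                              # collect results in order
        local $/;                                          # slurp the whole pipe
        my $body = readline $readers[$i];
        $content[$i] = defined $body ? $body : "timeout";
        close $readers[$i];
    }
    wait() for @urls;                                      # reap the children
    return @content;
}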
Project ID: 700154

About the project

11 proposals
Remote project
Active 14 years ago

Awarded to:
Please consider my bid, and see the project clarification board for additional information. I am an experienced developer with over 10 years of professional internet development experience. I have written countless perl scripts for browsing and scraping over my career. I am relatively new to Freelancer.com, which is good for you. In order to stay competitive, I have to lower my bid to make sure that it stands out. I believe that I can competently complete this project, quickly and accurately. If you have any questions feel free to ask. Thank you again for considering my bid for this project.
$30 USD in 1 day
5.0 (2 reviews)
1.6
11 freelancers are bidding on average $83 USD for this job
I can do this job for you. See PM for details.
$30 USD in 1 day
5.0 (147 reviews)
5.9
Memory usage could be really high for this project.
$120 USD in 4 days
5.0 (26 reviews)
5.6
I can do this.
$30 USD in 0 days
5.0 (25 reviews)
5.2
We can help with your project. Please check the PMB and our ratings/reviews to get an idea of our experience.
$225 USD in 5 days
3.7 (13 reviews)
5.3
Placing bid
$30 USD in 15 days
4.0 (5 reviews)
3.0
Hey Marc, I'm new to Freelancer but have 13 years of Perl experience, including botting/scraping. Please view my PM for more details. Thanks! Dan
$100 USD in 5 days
5.0 (1 review)
2.0
Please see PM. Thanks, R.
$50 USD in 1 day
5.0 (2 reviews)
1.7
Hi, I have 6+ years of experience using Perl (LWP, object-oriented Perl, APIs from CPAN), PHP, and MySQL to build web applications and implement middle-tier as well as backend objects. I feel this project is an ideal opportunity to expand the solution into an enterprise-wide deployment. I can deliver the solution with the necessary design and development documentation as specified.
$150 USD in 4 days
0.0 (0 reviews)
0.0
10+ years of IT experience with ample exposure to similar Perl programs. Please see the PMB for clarifications.
$50 USD in 3 days
0.0 (0 reviews)
0.0
I will write a multithreaded, command-line web client that can fetch data from multiple URLs concurrently. I will write this program quickly, and I will provide an option to have it read the URLs from a file (in addition to the ability to simply process an array). The number of sites that the program accesses concurrently, the agent name of the program, and the time-out period will all be configurable. The program will be able to deal with cookies and also with authentication, should that become necessary. Finally, the end user will have the option of writing the output to disk (in addition to simply keeping it in an array). My program will be free of bugs, or I will perform any additional work that relates to fixing bugs free of charge. The program will be a stand-alone program; it will contain a grab function that performs the job that you describe in your RFP. I will use the LWP::UserAgent, threads, and threads::shared modules, all part of core Perl. The only requirement of the program is that Perl be built with threads and that the version of Perl be reasonably recent (2 or 3 years old). Thanks, --Donnie (see the sketch below)
$100 USD in 1 day
0.0 (0 reviews)
0.0
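The code below is an editor's illustration of the threads-based approach described in the bid above, not the bidder's actual deliverable. It assumes a Perl built with thread support, reuses the project's $timeout and grab names, and leaves out the file input, cookie, and output-to-disk options the bid mentions; the URLs are placeholders.

#!/usr/bin/perl
# Sketch: one worker thread per URL, joined in order so $content[$i]
# corresponds to $urls[$i]. A production version would cap the number of
# concurrent threads for very large URL lists.
use strict;
use warnings;
use threads;
use LWP::UserAgent;

our $timeout = 20;    # per-request timeout, as in the project skeleton

sub grab {
    my @urls = @_;

    # Spawn one worker per URL; each returns the page body or "timeout".
    my @workers = map {
        my $url = $_;
        threads->create(sub {
            my $ua   = LWP::UserAgent->new(timeout => $timeout);
            my $res  = $ua->get($url);
            my $body = $res->is_success ? $res->decoded_content : undef;
            return defined $body ? $body : "timeout";
        });
    } @urls;

    # join() blocks until each worker finishes and hands back its result.
    return map { $_->join } @workers;
}

my @content = grab("http://www.example.com/", "http://www.example.org/");  # placeholders
print scalar(@content), " pages fetched\n";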

About the client

Miami Beach, United States
5.0
84
Payment method verified
Member since May 30, 2010

Client verification
