I need a crawler and a website with a database.
The crawler has to have the following functions (written in Perl):
- It has to start from a given URL
- It has to download every page it encounters
- It has to save the URL, the title and the description in a MySQL database
- It has to automatically follow every link found on each page
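The four crawler requirements above amount to a breadth-first crawl loop. A minimal sketch in Perl is below; it uses the core HTTP::Tiny module for fetching and DBI for MySQL. The table name `pages` and its columns (`url`, `title`, `description`), the database name `crawler`, and the credentials are assumptions for illustration, not part of the job description, and the regex-based HTML handling is a rough stand-in for a real parser such as HTML::TreeBuilder.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use HTTP::Tiny;   # core module since Perl 5.14

# Pull the <title> text out of an HTML page (rough; a real crawler
# would use an HTML parser instead of a regex).
sub extract_title {
    my ($html) = @_;
    return $html =~ m{<title[^>]*>\s*(.*?)\s*</title>}si ? $1 : '';
}

# Collect absolute http(s) links from anchor tags.
sub extract_links {
    my ($html) = @_;
    return $html =~ m{<a[^>]+href=["'](https?://[^"']+)["']}gi;
}

# Breadth-first crawl from one start URL, saving each page into a
# hypothetical pages(url, title, description) table.
sub crawl {
    my ($start_url, $dbh, $max_pages) = @_;
    my @queue = ($start_url);
    my %seen  = ($start_url => 1);   # avoid re-visiting URLs
    my $http  = HTTP::Tiny->new(agent => 'MiniCrawler/0.1');

    while (@queue && $max_pages-- > 0) {
        my $url = shift @queue;
        my $res = $http->get($url);
        next unless $res->{success};

        my $html   = $res->{content};
        my $title  = extract_title($html);
        my ($desc) = $html =~
            m{<meta\s+name=["']description["']\s+content=["']([^"']*)["']}si;

        $dbh->do('INSERT INTO pages (url, title, description) VALUES (?, ?, ?)',
                 undef, $url, $title, $desc // '') if $dbh;

        # Follow every link found on the page.
        for my $link (extract_links($html)) {
            push @queue, $link unless $seen{$link}++;
        }
    }
}

if (@ARGV) {
    require DBI;   # DBI + DBD::mysql are needed only when actually crawling
    my $dbh = DBI->connect('DBI:mysql:database=crawler', 'user', 'password',
                           { RaiseError => 1 });
    crawl($ARGV[0], $dbh, 200);   # cap pages per run for safety
}
```

Keeping `crawl` as a plain subroutine (rather than one monolithic script) is also what makes the later "upgradeable at any time" requirement realistic: new behaviour can be added as further subroutines without touching the loop.
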
The website has to have the following features:
- An input field that starts the crawler when a URL is entered
- A way to stop or pause the crawler at any time
- A results view that shows up to 200 search results (URLs, titles) per page
The crawler has to be programmed in a way that makes it possible to extend it with new functions at any time.
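The "200 results per page" feature is plain LIMIT/OFFSET pagination against the results table. A minimal sketch, again assuming a hypothetical `pages` table with `id`, `url` and `title` columns (not specified in the posting):

```perl
use strict;
use warnings;

use constant PER_PAGE => 200;   # results shown per page, per the spec

# Row offset for a 1-based page number: page 1 -> 0, page 2 -> 200, ...
sub page_offset {
    my ($page) = @_;
    return ($page - 1) * PER_PAGE;
}

# Fetch one page of results as an array of { url => ..., title => ... }
# hashrefs; $dbh is an open DBI handle to the MySQL database.
sub fetch_results {
    my ($dbh, $page) = @_;
    return $dbh->selectall_arrayref(
        'SELECT url, title FROM pages ORDER BY id LIMIT ? OFFSET ?',
        { Slice => {} },
        PER_PAGE, page_offset($page),
    );
}
```

A stop/pause control can reuse the same database: the web page sets a status flag in a table, and the crawl loop checks that flag between page fetches.
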
I have 70% of the source code working. I am working on a similar project of my own, and I would really like to get paid! I also speak German ;) You can set my working hours per week as you wish. Please check your inbox.