
Completed
Posted
Paid on delivery
I have 285 public-facing project pages and need the same 26 data points lifted from each one. You will receive a spreadsheet with every URL and a clear field dictionary, so you can move straight to extraction with Python, BeautifulSoup, Scrapy, or whichever toolchain you prefer. The finished dataset must come back as a single Excel/CSV file. Before you hand it over, give it a quick polish: apply basic, uniform formatting, drop any duplicates, and make sure each column lines up with the field names I supply. No heavy ETL work; just that first-pass cleanup so I can analyse the file immediately.

Deliverables
• CSV (or XLSX) containing 13 columns × ~285 rows, fully populated where data exists
• Basic formatting and de-duplication applied
• A short note flagging any URLs or fields that could not be captured

A quick turnaround is ideal; the job should be straightforward for anyone comfortable with web scraping and light data wrangling. We will provide the file to applicants for review.
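The field-dictionary-driven extraction described above can be sketched as a small Python/BeautifulSoup routine. The selectors, column names, and sample HTML below are hypothetical placeholders; the real mapping would come from the supplied field dictionary:

```python
from bs4 import BeautifulSoup

# Hypothetical field dictionary: output column name -> CSS selector.
FIELDS = {
    "title": "h1.project-title",
    "budget": "span.budget",
}

# Stand-in for one downloaded project page.
SAMPLE = """
<html><body>
  <h1 class="project-title">Data extraction</h1>
  <span class="budget">$100 AUD</span>
</body></html>
"""

def extract(html, fields):
    """Return one row dict, with None for any field the page lacks."""
    soup = BeautifulSoup(html, "html.parser")
    row = {}
    for col, selector in fields.items():
        node = soup.select_one(selector)
        row[col] = node.get_text(strip=True) if node else None
    return row

row = extract(SAMPLE, FIELDS)
```

In practice this function would run once per URL from the spreadsheet (fetched with `requests` or a Scrapy spider), accumulating one row per page before the cleanup pass.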
Project ID: 40251214
164 proposals
Remote project
Active 16 days ago
164 freelancers are bidding an average of $117 AUD for this job

Hi, I am good at data scraping using Python and Excel VBA. Please share a sample link. I can extract the data either 1) using Python or 2) using Excel Power Query to fetch the data in minutes and then clean it up. Looking forward to working on the project. -Chandra
$100 AUD in 1 day
8.5

Hi. Do all of the URLs have those data points available on them? If yes, kindly share the source links and file. I will do it for you. Best, Junaid.
$75 AUD in 1 day
7.8

Hello, I can extract the 26 specified data points from all 285 public-facing project pages using a clean, efficient scraping workflow (Python with BeautifulSoup/Scrapy, depending on site structure). Once you share the spreadsheet of URLs and field dictionary, I’ll map each field precisely to your column names and ensure consistent capture across every page. I’m comfortable handling pagination, minor structural variations, and edge cases while keeping the dataset aligned exactly to your supplied schema. The final delivery will be a single CSV or XLSX file containing the required columns and ~285 rows, fully populated wherever data exists. Before submission, I’ll apply uniform formatting, remove duplicates, validate column alignment, and provide a short note highlighting any URLs or fields that could not be captured. I can turn this around quickly once I review the file structure and confirm access. Best regards, Shamima
$50 AUD in 1 day
8.0

Hello, I'm an experienced data entry specialist with expertise in extracting data from various websites. Happy to provide a sample & receive feedback. My time zone is GMT+7, which is 4 hours behind. If you are interested, please send me a private message. Best regards, Hoang
$143 AUD in 3 days
7.7

Hi, I have expertise in web scraping using Python and can provide you with a CSV/Excel file containing the required data points for all 285 project URLs. I'm available to start right away and can complete this in 1 day. Abdul H.
$50 AUD in 1 day
7.8

⭐⭐⭐⭐⭐ Extract Data from 285 Project Pages with Python and Scrapy ❇️ Hi My Friend, I hope you are doing well. I've reviewed your project requirements and noticed you're looking for data extraction from 285 project pages. Look no further; Zohaib is here to help you! My team has successfully completed over 50 similar projects focused on data extraction. I will efficiently gather the 26 data points you need using Python and BeautifulSoup or Scrapy. I will ensure the final dataset is clean and well-structured, formatted uniformly, and will drop any duplicates. You will receive a single Excel or CSV file that is ready for your analysis, all within your budget. ➡️ Why Me? I can easily do your data extraction project as I have 5 years of experience in web scraping and data manipulation, including Python, BeautifulSoup, and Scrapy. I also have a strong grip on data cleaning and formatting, ensuring that your dataset meets your requirements. ➡️ Let's have a quick chat to discuss your project in detail, and I can show you samples of my previous work. Looking forward to discussing this with you in chat. ➡️ Skills & Experience: ✅ Python Programming ✅ Web Scraping ✅ Data Extraction ✅ BeautifulSoup ✅ Scrapy ✅ Data Cleaning ✅ Excel Formatting ✅ CSV File Handling ✅ Data Deduplication ✅ Data Analysis ✅ API Integration ✅ Task Automation Waiting for your response! Best Regards, Zohaib
$150 AUD in 2 days
7.9

Hi! I specialize in web scraping and data extraction with 9+ years of experience delivering clean, structured datasets quickly and accurately. Here's how I can help: * Scrape all 285 URLs to extract the 26 specified data points per page * Consolidate results into a single CSV/XLSX file with correct columns and uniform formatting * Remove duplicates and ensure all fields align with your dictionary * Flag any missing or problematic data for your review * Deliver a ready-to-analyze dataset with a short summary note Do you want me to use Python with BeautifulSoup, Scrapy, or do you have a preferred tool for this extraction?
$180 AUD in 7 days
7.3

Hello, I'll be glad to assist you with this project. I can work with you during the next hours to have this done. I will complete this project with 100% accuracy. Click on the "CHAT" button so we will discuss it in detail. I'm always online and available. Please feel free to contact me at any time. I am available 24/7 for support. Best Regards Sandeep
$30 AUD in 1 day
7.3

Hi! I can extract the 26 defined data points from all 285 project pages and return a clean, analysis-ready dataset quickly and accurately. I’ll use Python (BeautifulSoup or Scrapy depending on structure) to pull each field according to your dictionary, then consolidate everything into a single CSV/XLSX. Before delivery, I’ll: • Align columns exactly to your field names • Apply uniform formatting • Remove duplicates • Flag any missing or problematic URLs/fields You’ll receive a tidy file (13 columns × ~285 rows) plus a short note outlining any gaps. Once you share the sample file and field map, I can confirm structure and timeline right away.
$55 AUD in 1 day
7.6
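The pre-delivery checklist in the bid above (align columns to the field names, apply uniform formatting, remove duplicates, flag gaps) can be sketched in plain Python. The column names, sample rows, and the choice of URL as the deduplication key are all assumptions for illustration:

```python
import csv
import io

COLUMNS = ["url", "title", "budget"]  # assumed subset of the field dictionary

# Hypothetical scraped rows: one duplicate URL and one missing field.
rows = [
    {"url": "https://example.com/p/1", "title": " Alpha ", "budget": "$100 AUD"},
    {"url": "https://example.com/p/1", "title": "Alpha",   "budget": "$100 AUD"},
    {"url": "https://example.com/p/2", "title": "Beta",    "budget": ""},
]

def clean(rows, columns):
    """Trim whitespace, align each row to the column list, drop duplicate
    URLs, and collect URLs with missing fields for the summary note."""
    seen, out, gaps = set(), [], []
    for r in rows:
        rec = {c: (r.get(c) or "").strip() for c in columns}
        if rec["url"] in seen:
            continue
        seen.add(rec["url"])
        out.append(rec)
        missing = [c for c in columns if not rec[c]]
        if missing:
            gaps.append((rec["url"], missing))
    return out, gaps

cleaned, gaps = clean(rows, COLUMNS)

# Write the deduplicated rows as CSV, columns in field-dictionary order.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=COLUMNS)
writer.writeheader()
writer.writerows(cleaned)
```

The `gaps` list maps directly onto the "short note flagging any URLs or fields that could not be captured" deliverable.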

Hello, Yes — this is a straightforward extraction task and I’d be happy to handle it. I’m experienced with Python-based scraping (BeautifulSoup / Scrapy) and can efficiently collect your 26 data points across all 285 project pages. I’ll return a clean, well-structured CSV or Excel file with consistent formatting, duplicates removed, and columns aligned precisely with your field dictionary. Ready to begin once I review the file. Best regards, MD
$50 AUD in 1 day
7.3

Hello, I have several years of experience with automated web scraping and have completed hundreds of web scraping projects on the freelancer.com platform. I am very experienced with Python, BeautifulSoup, and Scrapy. I am ready to start once you share the URLs; I judge this to be easy work, so I am charging less than $35.
$33.50 AUD in 1 day
7.2

Youssef, Full-Time Freelancer with Python Programming expertise in web scraping, data extraction, and complex automation workflows. I understand you need to scrape 26 specific data points from your 285 public project pages, delivering a clean CSV/Excel file. I can efficiently handle this extraction, using your provided spreadsheet of URLs and field dictionary to move straight to data capture. I'll leverage powerful Python tools like Scrapy, BeautifulSoup, or Playwright to ensure accurate data retrieval, even for dynamic content. The final dataset will be delivered as a single CSV, with basic uniform formatting, deduplication applied, and columns perfectly aligned to your field names. I will also include a clear note flagging any uncapturable URLs or fields. I have significant experience with projects involving precise data extraction and first-pass cleanup.
$250 AUD in 1 day
7.3

As a leader of BN-Droids Digital Services, I offer you a rich blend of comprehensive web scraping skills and vast experience in the field. My team, comprised of five talented and professional members, delivers instant services of high quality. Over the past five years, my efforts have been consistently directed towards executing similar projects that involve scraping copious amounts of data. Our specialized and dedicated team is capable of extracting over 1 million data entries on a daily basis and even maintains a vast database consisting of more than 20 million retail data points. Hence, I assure you that your project consisting of gathering specific fields from 285 URLs will not be merely accomplished punctually, but with utmost expertise. We are proficient with various tools like Python, BeautifulSoup, Scrapy, which we can utilize for your project. To further assure you satisfactory results, BN-Droids team not only extracts data but constantly applies quality checks to ensure maximum accuracy in the final CSV/XLSX file. Allow me to utilize my skills and resources combined with your tools and URLs to complete this task effortlessly and professionally for you!
$30 AUD in 7 days
6.9

Hi there. With over 13 years of experience specializing in customized Python web automation and data mining, I am confident that I have the skills, tools, and expertise to complete your web scraping project to perfection. I have a strong command of Python and all the relevant tools such as BeautifulSoup, Scrapy- just the tools you've highlighted for this project! I have completed projects similar to yours with a highly detailed focus on data extraction and analysis. I completely understand the urgency of this project and assure you my delivery will be swift without compromising on accuracy. As someone who has worked extensively in Python, Scraping, Data Entry/Processing projects - including transforming/analyzing bank statements/Police reports & migrating PDF data into Excel forms using OCR; I am deeply comfortable in executing all tasks needed for this project. By picking me for this job, you're also getting a professional who is well-versed in handling large datasets, delivering on-time results, and providing exceptional communication throughout. Let's connect today to discuss further how I can bring value to your project!
$30 AUD in 1 day
7.1

Hi, I am an expert in developing scripts that automate the process of scraping data from sites. I am available to start right away and can provide the data within a couple of hours. I will create a script to get the data, generate a snapshot of each page, and deliver both. I will also double-check accuracy against the screenshots. I am available right away. Thanks
$30 AUD in 1 day
7.0

Hi, Full-Time Freelancer with Python Programming expertise in web scraping, data extraction, and complex automation workflows. I understand you need to scrape 26 specific data points from your 285 public project pages, delivering a clean CSV/Excel file. I can efficiently handle this extraction, using your provided spreadsheet of URLs and field dictionary to move straight to data capture. I'll leverage powerful Python tools like Scrapy, BeautifulSoup, or Playwright to ensure accurate data retrieval, even for dynamic content. The final dataset will be delivered as a single CSV, with basic uniform formatting, deduplication applied, and columns perfectly aligned to your field names. I will also include a clear note flagging any uncapturable URLs or fields. I have significant experience with projects involving precise data extraction and first-pass cleanup.
$50 AUD in 1 day
7.1

Hello, I have reviewed your project details and clearly understand the requirement to extract 26 structured data fields from 285 project URLs with clean formatting and accurate alignment. I can confidently handle this extraction workflow with reliable scraping and validation. I will begin by building a Python-based scraping pipeline using BeautifulSoup and Scrapy to parse each page efficiently while mapping every field to your provided dictionary. I will implement request handling, HTML structure validation, and fallback selectors to ensure consistent data capture across varying layouts. Next, I will structure the extracted dataset using pandas for cleaning, normalization, and duplicate removal, then export the final dataset into a properly formatted Excel or CSV file compatible with Microsoft Excel. Finally, I will perform verification checks and provide a short report highlighting any missing or blocked fields for transparency. Do the pages share a consistent HTML structure or should the scraper handle multiple layout variations? Let’s connect in chat to start immediately. Best Regards, Aneesa.
$250 AUD in 2 days
6.8
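The "fallback selectors" idea mentioned in the bid above, for pages whose layout varies, can be sketched as trying an ordered list of CSS selectors per field until one matches. The selector names and HTML fragments here are hypothetical:

```python
from bs4 import BeautifulSoup

# Hypothetical ordered fallbacks for one field: newest layout first.
FALLBACKS = {
    "budget": ["span.budget-new", "span.budget", "div.project-budget"],
}

def select_with_fallback(soup, selectors):
    """Return text from the first selector that matches, else None."""
    for selector in selectors:
        node = soup.select_one(selector)
        if node:
            return node.get_text(strip=True)
    return None

old_layout = BeautifulSoup('<span class="budget">$100 AUD</span>', "html.parser")
new_layout = BeautifulSoup('<span class="budget-new">$150 AUD</span>', "html.parser")

old_budget = select_with_fallback(old_layout, FALLBACKS["budget"])
new_budget = select_with_fallback(new_layout, FALLBACKS["budget"])
```

Fields where every fallback misses come back as None, which feeds naturally into the "flag missing or blocked fields" report the bid promises.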

Hi there, I’ll scrape all 285 URLs, extract the 26 data points accurately, clean and deduplicate the dataset, and deliver a polished CSV/XLSX ready for analysis. Best regards, Siddiqur Rahman.
$60 AUD in 1 day
6.7

Hello Sir, It is my pleasure to be able to bid on your project. I am Ayan from India. I have carefully reviewed the details in your project description and am excited to work on your esteemed project. I have previously completed similar projects for my clients, so I am 100% confident that I can execute the project with a high quality that will satisfy you. I want to give my fullest and maximum contribution to its success. ✅ Why Me: ✔ 500+ projects completed on Freelancer.com ✔ 5.0-star rating and repeat international clients ✔ Fast turnaround and responsive communication For better clarification, you can also look at the 500+ excellent 5-star reviews I have earned on the Freelancer site. ✅ If you’d like, I can prepare a short sample using your instructions (a free pilot project) before I start fully. This will help you judge my adaptability and skill level. Just give me one chance; I will not let you down. Thanks Ayan
$200 AUD in 3 days
6.7

Thank you for the clear scope. This is exactly the type of structured extraction work we handle regularly. With 285 URLs and a defined field dictionary, we can move straight into automated scraping using Python (BeautifulSoup or Scrapy), ensuring each of the 26 data points is captured accurately and mapped precisely to your supplied column names. You’ll receive a single clean CSV or XLSX file (13 columns × ~285 rows), fully populated where data exists and ready for immediate analysis—plus a short summary note on any capture gaps. Are the project pages consistently structured (same layout/template) or should we expect variations across URLs?
$200 AUD in 6 days
7.2

Wagga Wagga, Australia
Payment method verified
Member since Apr 17, 2024