Your guide to getting data entry done for your business
Data entry is an important task, but choosing the wrong solution can seriously harm your company's productivity.
Data scraping is the process of extracting data from websites and databases. Data scrapers are experts who collect this data and save clients both time and money by automating data collection. These professionals use tools like Python, HTML, XML, Laravel, and more to access web pages and other sources of digital information, scrape large amounts of data, and analyze it in meaningful ways.
Here are some projects our expert Data Scrapers have made real:
Data scraping is an incredibly valuable tool that can help companies work more efficiently by streamlining how they build and maintain their digital databases. Our expert team of Data Scrapers is well-equipped to make these improvements in whatever form they're needed. If you're looking to improve your own business through data scraping, why not post your project on Freelancer.com? Our Data Scrapers are ready to help you reach your goals.
Across 129,615 reviews, clients rate our Data Scrapers 4.9 out of 5 stars.
More details:
- What type of websites are you referring to? Dating websites.
- Do you have access to the login records of these social media websites? No, I need help getting access.
- How soon do you need your project completed? ASAP.
Some of my personal information was leaked in a website breach, and I would like to see how many times it was used to sign up on other websites.
I need a clean, reliable database of 100,000 email addresses belonging strictly to UK-based construction contractors, sub-contractors and related businesses. Every address has to be live and deliverable, with verification carried out through Mailgun so the final bounce rate stays well below two percent. The file must be delivered in CSV format and, besides the email itself, each record must also hold the company name, its location (city or county is enough) and a short trade / industry label that tells me whether they handle groundworks, roofing, fit-out, M&E, etc. No duplicates and no generic “info@” style inboxes, please—only contacts that decision-makers actually monitor. To keep everything compliant, source the data from publicly available material and follo...
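Requirements like "no duplicates and no generic info@ inboxes" amount to a simple normalisation pass before verification. A minimal sketch in Python; the GENERIC_PREFIXES set is an illustrative assumption, not part of the brief:

```python
# Role-based inbox prefixes to exclude; an assumed list, extend as needed.
GENERIC_PREFIXES = {"info", "admin", "sales", "contact", "support", "office"}

def clean_emails(emails):
    """Normalise case and whitespace, then drop malformed entries,
    duplicates, and generic role-based inboxes."""
    seen, kept = set(), []
    for raw in emails:
        email = raw.strip().lower()
        if "@" not in email:
            continue  # skip malformed entries
        local_part = email.split("@", 1)[0]
        if local_part in GENERIC_PREFIXES or email in seen:
            continue
        seen.add(email)
        kept.append(email)
    return kept
```

Deliverability checking (the Mailgun step) would then run only on this cleaned list, keeping verification costs down.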
I need assistance merging my current football dataset with a new one. This new dataset will be sourced from online scraping of weather and expected goals (xG) data.

Requirements:
- Scrape data from official weather and football statistics websites.
- Integrate the following weather data: temperature, humidity, and precipitation.
- Work with datasets in Excel format.
- Correlate this new data with historical football match data in my existing dataset.

Ideal Skills and Experience:
- Proficiency in data scraping and data manipulation.
- Experience with Excel and handling large datasets.
- Familiarity with weather and football data.
- Strong analytical skills to ensure accurate correlation of datasets.

Looking forward to your proposals!
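The heart of a merge like this, joining match rows to weather rows on a shared date key, can be sketched in plain Python before any Excel tooling is involved. The field names below are illustrative assumptions about the two datasets:

```python
def merge_on_date(matches, weather):
    """Join match rows to weather rows on their shared 'date' key.
    Matches with no weather reading keep None placeholders so the
    merged table stays rectangular."""
    by_date = {w["date"]: w for w in weather}  # index weather once
    merged = []
    for m in matches:
        w = by_date.get(m["date"], {})
        merged.append({**m,
                       "temperature": w.get("temperature"),
                       "humidity": w.get("humidity"),
                       "precipitation": w.get("precipitation")})
    return merged
```

The same left-join shape maps directly onto a spreadsheet lookup once the scraped weather data lands in Excel.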
I need a reliable solution that will pull every public post mentioning a set of keywords I will share with you, across all the data on X, Instagram and Facebook. The scrape must cover the three primary markets I am focused on right now (Thailand, the Philippines and India, among others), so geo-filtering or language filtering needs to be baked in from the start. For every matching post I want the full engagement picture captured: the comment text, number of comments, likes, reposts/shares, the post date and any other readily available metadata (author handle, follower count, post URL, media links, etc.). Accuracy is critical because the data will feed a trend-analysis dashboard later. Please build the workflow in a way that respects rate limits and login requirements: if you intend to use...
Virtual assistance for testing in Denmark

We are looking for a QA specialist, a virtual assistant or a data scraper based in Denmark for long-term cooperation to test mobile and web applications. The ideal candidate must be detail-oriented, reliable, and able to strictly follow provided instructions and test scenarios. No special skills are required to complete the testing.
AI Automation for Finance Analytics (AI / Machine Learning)

DO NOT BID IF BIDDING FOR A 40-HOUR WORK WEEK. WE ARE LOOKING FOR A CONSULTANT / BUILDER / TUTOR TO WORK WITH OUR TEAM 3-10 HOURS A WEEK TO BUILD THE SYSTEM JOINTLY. DO NOT BID FOR LONGER THAN THOSE HOURS. DO NOT BID FOR FULL-TIME WORK.

Details of what I need help with: I run a real estate private equity and hotel development platform. We want to replace manual analysis and reporting with a practical AI workflow. This is about extracting, comparing, and interpreting data. Excel and PowerPoint remain the source of truth.

What we need:
- Compare PowerPoint vs Excel and flag mismatches
- Explain underwriting models and trace outputs
- Compare legal/term sheets vs financial assumptions
- Track document versions and changes
- Summarize deal...
I am currently using Apify at $1.50 per 1,000 leads. I need this at scale (around 50k emails), so it calls for a cost-effective solution. Bid on this proposal and I shall DM you; I need to know the cost for:
1. Apollo emails
2. LinkedIn emails
I need a senior-level specialist to harvest product data from several e-commerce sites and deliver it in a single, well-structured CSV file. The task demands production-ready techniques—think Scrapy spiders hardened with rotating proxies, Selenium or Playwright for dynamic content, and solid anti-bot countermeasures. The information I’m after is very specific: product names, prices, pictures, and SKU. Nothing less, nothing more. Your solution must run reliably at scale, cope with frequent layout changes, and leave no trace that could trigger blocks. Python is the preferred stack, but if you have a proven alternative that meets the same bar, I’m open to hearing it. To be considered, include in your proposal: • At least one example of a comparable e-commerce scrapi...
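The rotating-proxy idea this listing mentions can be sketched independently of any framework: pair each outgoing request with the next proxy in a round-robin pool and a randomised user agent. The proxy URLs and agent strings below are placeholders, not working endpoints:

```python
import itertools
import random

# Placeholder pools; a real run would load live proxies and realistic UA strings.
PROXIES = ["http://proxy-a:8080", "http://proxy-b:8080", "http://proxy-c:8080"]
USER_AGENTS = ["agent-1", "agent-2", "agent-3"]

def request_profiles(proxies, agents):
    """Yield one (proxy, headers) pair per request: proxies rotate
    round-robin while the user agent varies at random, so consecutive
    requests present different fingerprints to the target site."""
    for proxy in itertools.cycle(proxies):
        yield proxy, {"User-Agent": random.choice(agents)}
```

In a Scrapy project the equivalent behaviour normally lives in downloader middleware; the generator above just isolates the rotation logic.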
PDF to Excel Data Scraper Needed

Job Title: Data Scraper Needed: Convert 24 PDF Factsheets to Clean Excel (Mutual Fund Portfolios)

Project Overview: I need a freelancer to extract detailed stock portfolio data from ~24 Mutual Fund Monthly Factsheets (PDFs). I will provide the URLs/files. Your job is to extract the full stock holdings table for specific funds and deliver a consolidated, clean Excel/CSV file.

The Goal: I need the complete list of stocks (100% of the portfolio), NOT just the Top 10. The data is used for financial backtesting, so accuracy is critical. Even top 85-90% data works.

Scope of Work:
- Input: ~24 PDF files (monthly factsheets).
- Target Funds: For each month, extract data for the Top 10 Equity Funds (e.g., Bluechip, Midcap, Smallcap, Value Discovery, etc. - list wi...
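Once the raw table text has been pulled out of a PDF (with whatever extraction library the freelancer prefers), holdings rows still need parsing into name/weight pairs. A sketch of that step; the assumed row shape ("Stock Name 7.2%") is an illustration of how factsheet lines typically extract, not a guarantee for these specific documents:

```python
import re

# One holding per line: a stock name followed by a percentage weight.
ROW = re.compile(r"^(?P<stock>.+?)\s+(?P<weight>\d+(?:\.\d+)?)\s*%$")

def parse_holdings(lines):
    """Turn raw factsheet lines like 'HDFC Bank 7.2%' into (name, weight)
    tuples; non-matching lines (headers, footers, blanks) are skipped."""
    rows = []
    for line in lines:
        m = ROW.match(line.strip())
        if m:
            rows.append((m.group("stock"), float(m.group("weight"))))
    return rows
```

Summing the parsed weights per fund is a cheap sanity check that the full portfolio, not just the Top 10, was captured.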
I need a seasoned Python developer to build a robust scraper that collects the required data and writes it straight to JSON, with no additional cleaning or processing necessary. Once we begin I'll provide the target URL(s) and any access details; for now, assume a standard public site with pagination and occasional anti-bot checks.

Core expectations:
- Written in Python 3 using requests/BeautifulSoup or Scrapy; resort to Selenium only if there's no lighter workaround.
- Handles pagination, retries, and polite delays gracefully so the run can complete unattended.
- Config file or clear constants for headers, cookies, and start URLs, letting me tweak targets without editing core logic.
- Produces a single JSON file (or one file per page if that's...
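A skeleton matching those expectations might look like the following. The start URL and headers are placeholders, the page-walking rule (stop at the first empty page) is an assumption, and the per-site parse function is deliberately left to be supplied:

```python
import json
import time
from urllib.request import Request, urlopen
from urllib.error import URLError

# Clear constants up top, so targets can be tweaked without touching logic.
HEADERS = {"User-Agent": "Mozilla/5.0 (polite research scraper)"}  # placeholder
START_URL = "https://example.com/listings?page={page}"             # placeholder
DELAY_SECONDS = 1.0
MAX_RETRIES = 3

def fetch(url, retries=MAX_RETRIES):
    """Fetch one page with exponential-backoff retries; None on failure."""
    for attempt in range(retries):
        try:
            with urlopen(Request(url, headers=HEADERS), timeout=30) as resp:
                return resp.read().decode("utf-8", errors="replace")
        except URLError:
            time.sleep(2 ** attempt)
    return None

def scrape_all(parse_page, fetch_page=fetch, max_pages=100):
    """Walk numbered pages until one yields no records, pausing politely
    between requests so the run can complete unattended."""
    records = []
    for page in range(1, max_pages + 1):
        html = fetch_page(START_URL.format(page=page))
        if html is None:
            break
        page_records = parse_page(html)
        if not page_records:
            break  # past the last page
        records.extend(page_records)
        time.sleep(DELAY_SECONDS)
    return records

def save_json(records, path="output.json"):
    """Write everything straight to a single JSON file, as requested."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump(records, f, ensure_ascii=False, indent=2)
```

Injecting `fetch_page` and `parse_page` keeps the pagination/retry core testable without touching the network; a real bid would swap in requests/BeautifulSoup parsing per the listing.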
I need to build a reliable, well-structured lead list and I already know exactly what it should contain. The task is to extract contact information (email addresses, phone numbers and full mailing addresses) from three sources: company and organisation websites, their public social-media profiles, and well-known online directories. I expect the data to be gathered with a solid scraping workflow (Python, Scrapy, BeautifulSoup, Selenium or an equivalent stack is fine) and then verified so that bounced emails and dead numbers are kept to an absolute minimum.

Deliverables:
- One CSV or Excel file with separate columns for name, company, job title, email, phone, street address, city, state, ZIP/postcode, country, source URL and date collected.
- No duplicates; every...
We're looking for 20 freelancers to help with:
- Screening apartment applicants who are not on Airbnb yet, so we can get them listed.
- Answering tenant questions.

Our cities: London / Stockholm / Barcelona / Ibiza / Los Angeles / Miami

You're a fit if you are:
- Sharp and fast
- Friendly and ambitious
- Detail-oriented and reliable
- Comfortable working independently

The income is about $1,000 to $3,000. Send a short intro with your experience and availability.

Sincerely, Alex & Dimpi
For an upcoming market research study, I need a fully-automated workflow that gathers and enriches data from well over 500 LinkedIn profiles. The automation should locate the profiles that match criteria I will provide, pull the key public details, then append reliable off-platform contact information so I can reach those professionals directly. Please design the script or low-code sequence with any reliable stack you prefer; Python, Selenium, PhantomBuster, Sales Navigator API, or comparable tools are fine as long as the method is repeatable and respects rate limits.

Deliverables:
- CSV/Excel file containing one row per person with:
  - Current job title
  - Company name
  - Verified email (and phone, when available)
- Source code or workflow fi...
Virtual assistance for testing in Burkina Faso, Cameroon, Ghana and Guinea

We are looking for a QA specialist, a virtual assistant or a data scraper based in Burkina Faso, Cameroon, Ghana or Guinea for long-term cooperation to test mobile and web applications. The ideal candidate must be detail-oriented, reliable, and able to strictly follow provided instructions and test scenarios. No special skills are required to complete the testing.
Need an existing database of European countries which you scraped earlier.
The contractor is commissioned to download DRM-protected videos from an online portal to which the client has legitimate access and usage rights. The videos must be processed as follows:
- Download approximately 240 videos (about 18 hours of video material) from the portal.
- The videos have an average length of approximately 5 minutes.
- Original video titles must be preserved.
- The videos must be organized into folders according to the portal's order/structure.
- All files must be uploaded and stored on Google Drive.
- The final folder structure on Google Drive must match the structure on the portal.
The dataset I need transferred contains both text and numerical values and must be entered faithfully into Word and Excel templates I will supply. Consistency across both formats is essential, so I expect close attention to detail as you type, verify, and cross-check each entry. I work to tight schedules, therefore speed is important—but never at the expense of accuracy. A brief note outlining your prior data-entry experience will help me gauge how quickly you can adapt to the structure of my files and any keyboard shortcuts I already have in place. I will hand over the source documents, the destination spreadsheets, and clear naming conventions once we agree on timing. If you spot anomalies or unreadable characters while you work, flag them so we can resolve issues in real time. ...
Virtual Assistant Specialising in Prospecting and Commercial Research (B2B Lead Generation)

We are looking for a highly qualified, proactive Virtual Assistant with a semi-commercial, analytical profile, focused on prospecting and commercial research. This role is not for a classic administrative assistant, but for someone who thinks and acts like a junior Sales Development Representative (SDR).

Key responsibilities:
- Lead research: find businesses in the beauty & wellness sector by city. Identify key decision-makers (owners/executives). Validate lead size and fit with our program/licence.
- Data extraction and enrichment: collect email...
I need a fresh, accurate Gmail list of people who own their homes in the United States, preferably those in the older age brackets. The file should be fully verified so every address is active, and formatted for easy import into my CRM (Excel).

Details I expect for each lead:
- 200K email list
- All active USA mail
- Verified
- Duplicates removed
- First and last name
- Gmail address
- City / State (ZIP code if available)
- Confirmation that the contact is a homeowner
I have an urgent need for a clean, well-structured dataset containing the listing agent's first name, last name, mailing address, and phone number for well over 500 active Zillow listings. Speed is critical, but accuracy matters just as much; the final file should be ready for immediate import into my CRM. You are free to use whichever stack you prefer (Python with BeautifulSoup or Scrapy, Selenium, residential proxies, even the unofficial Zillow API) so long as rate limits are respected and the data is complete. I don't need property details or price history; the focus is strictly on the agent contact fields.

Deliverables:
- CSV or XLSX with a separate column for each required field
- A short read-me explaining the script or method so I can rerun it la...
I need a clean pull of every location listed on the site. For each branch please capture: country, state, complete address, service type, phone number, and email address. The final deliverable is a single Microsoft Excel workbook containing one sheet only. All columns should be clearly labelled and the range converted to an official Excel Table so I can apply native filters instantly. No additional filtering is required on your side; just be sure the table structure supports easy filtering by any column once I open the file. Accuracy matters more than speed: every location on the site has to be included and the contact details must match what is shown online. When you hand over the file I will spot-check a sample of entries against the live site to confirm completeness and correctness bef...