Web scraping is the process of extracting data or information from an online source such as a website, database, or application. Web Scraping Specialists help people collect valuable digital data and quickly find the information they need from websites, mobile apps, and APIs. These experts use scraping tools and advanced techniques to gather large amounts of targeted data with no manual work required from the client.

With web scraping, tasks that would otherwise take a lot of time can be automated and completed far faster. Our experienced Web Scraping Specialists use their expertise to develop scripts that continuously pull from structured and unstructured data sources.

Here are some projects our expert Web Scraping Specialists have made real:

  • Web searches and collecting data
  • Data transfers between websites
  • Downloading images from URLs and inserting them into a database
  • Automating the sending of emails and SMSs
  • Collecting website data and exporting it to spreadsheets
  • Creating custom bots to generate or collect online user feedback
  • Collecting contact details, business leads, influencers, or any other specific data
  • Creating dictionaries in official world languages other than English
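
A task like "collecting website data and exporting it to spreadsheets" can be sketched in a few lines of Python. This is a minimal illustration using only the standard library; the HTML snippet, the choice of `<h2>` as the target element, and the column name are hypothetical stand-ins for whatever a real project specifies.

```python
import csv
import io
from html.parser import HTMLParser

class TitleCollector(HTMLParser):
    """Collect the text of every <h2> element on a page."""
    def __init__(self):
        super().__init__()
        self.in_h2 = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        if tag == "h2":
            self.in_h2 = True

    def handle_endtag(self, tag):
        if tag == "h2":
            self.in_h2 = False

    def handle_data(self, data):
        if self.in_h2 and data.strip():
            self.titles.append(data.strip())

# In a real job the HTML would come from an HTTP request; here it is inlined.
html = "<h1>Shop</h1><h2>Blue Widget</h2><p>$9</p><h2>Red Widget</h2>"
parser = TitleCollector()
parser.feed(html)

# Export the scraped values to a CSV "spreadsheet".
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["title"])
writer.writerows([t] for t in parser.titles)
print(parser.titles)  # → ['Blue Widget', 'Red Widget']
```

Production scrapers usually swap the stdlib parser for libraries such as BeautifulSoup or Scrapy, but the shape of the work stays the same: fetch, parse, select, export.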

Web Scraping Specialists are skilled professionals who help businesses optimize processes while collecting the rich, structured data they need for their specific purposes. Our experts speed up the process and return accurate results in less time, so customers can make better decisions more quickly without any manual labour. If you are looking for a talented professional to build a web scraping project for you, you have come to the right place. Here on Freelancer.com you can find talented professionals who will get the job done with top-quality results! Post your project now and see what our Web Scraping professionals can do for you!

From 361,926 reviews, clients rate our Web Scraping Specialists 4.9 out of 5 stars.
Hire Web Scraping Specialists


    90 jobs found

    I need a Python-based pipeline that can routinely scrape key metrics from a shortlist of social-media profiles, crunch the numbers with Pandas, and surface everything in a clean, interactive Streamlit dashboard. Here is the flow I have in mind. A Requests-powered spider (or another lightweight HTTP approach if you prefer) grabs publicly available post data, follower counts, engagement numbers, and any other fields we identify as useful. The raw pulls are saved locally as CSV or funneled straight into a lightweight SQLite store—whichever keeps the hand-off to Pandas friction-free. Once there, Pandas should calculate rolling engagement rates, growth curves, top-performing content, and daily deltas. Finally, Streamlit turns those tables into plots, sortable tables, and headline KPI car...

    $472 Average bid
    14 bids
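
The Pandas stage of a pipeline like the one above can be sketched briefly. This is an illustrative fragment, not the project itself: the column names (`followers`, `likes`, `comments`) and the 3-day window are assumptions, and the inlined rows stand in for the CSV/SQLite store the spider would write to.

```python
import pandas as pd

# Hypothetical daily pulls for one profile.
df = pd.DataFrame({
    "date": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-03", "2024-01-04"]),
    "followers": [1000, 1010, 1030, 1060],
    "likes": [50, 40, 80, 60],
    "comments": [5, 4, 8, 6],
})

# Engagement rate per day: interactions relative to audience size.
df["engagement"] = (df["likes"] + df["comments"]) / df["followers"]

# Rolling engagement and daily follower deltas (the "growth curve").
df["rolling_engagement"] = df["engagement"].rolling(window=3, min_periods=1).mean()
df["follower_delta"] = df["followers"].diff().fillna(0)

print(df[["date", "rolling_engagement", "follower_delta"]])
```

Streamlit would then render `df` directly: `st.line_chart(df.set_index("date")["rolling_engagement"])` is typically all it takes to turn such a table into an interactive plot.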

    Project Title: Automated Web Scraper & AI Summarizer for Legal Documents (Italian Administrative Justice Portal) Project Overview: I need an automated workflow to monitor judicial rulings from the Italian Administrative Justice website (). The system should perform weekly searches based on dynamic keywords (e.g., "appalti"). Requirements: Web Scraping: Create a scraper (using Python/Playwright or Browse AI) that can bypass anti-bot protections, input keywords, and extract PDF links and metadata for rulings published in the last 7 days. Cloud Integration: Use (Integromat) to organize the results. For each keyword, create/update a specific Google Drive folder and upload the new PDFs. AI Processing: Integrate Google Gemini API to: Analyze the rulings within each folder. ...

    $338 Average bid
    66 bids

    I’m assembling a nationwide database of barbers and beauty-salon owners in the United States and need your help compiling it. Every record must be fully verified and include three data points: • Owner’s full name with current phone number and email address • Business name with complete street address (city, state, ZIP) • Key service specialties the shop promotes (e.g., fades, coloring, micro-blading, etc.) Accuracy is critical, so I will spot-check contact details and discard any duplicates or outdated entries. Please provide the finished list in a clean spreadsheet or CSV that I can filter and sort easily. If you already have a recent, reliable dataset, let me know; otherwise, outline how you’ll source and verify each contact before I award the p...

    $30 Average bid
    28 bids

    I need help pulling the exact wording that appears on a series of public web pages so I can repurpose it into fresh Facebook posts. I will send you a list of URLs; your task is to visit each page, copy every headline, sub-heading, call-to-action, and any punchy phrases that could spark engagement, then place them in a Google Sheet along with the page link they came from. Because the end goal is content creation for Facebook, accuracy and context matter: keep line breaks where they add meaning, omit navigation or footer clutter, and do not paraphrase—just extract the text verbatim. Deliverables • Google Sheet with one row per snippet: original wording, source URL, and character count • Quick note if a page blocks scraping or is empty If you already use tools suc...

    $36 / hr Average bid
    61 bids

    PROJECT OVERVIEW I am looking for a detail-oriented researcher to perform a competitive analysis of the Restoration & Mitigation industry (Water, Fire, and Environmental services) in specific geographic regions. SCOPE OF WORK 1) Target Identification: Identify active service providers/contractors currently utilizing Google Local Service Ads (LSAs) within our target markets. 2) Data Extraction: For each qualifying business, compile a clean spreadsheet including: A - Company Name, Physical Address, website if applicable B - Primary Office Line. C - Key Decision Maker (Owner/Principal): Identify the primary point of contact. D - Direct Contact Information: Provide verified Direct Dials or business contact lines for the identified decision-maker. Requiremen...

    $43 Average bid
    23 bids

    Hello, I am building a small data collection tool for a medical AI research project. The goal is to collect doctor-patient role-play audio data and prepare it for AI training. I need a simple system with the following features: 1) Web-based interface (works on mobile and desktop) 2) Upload audio files (WAV/MP3) 3) Automatic speech-to-text using Whisper 4) Editable transcript (dialect correction) 5) Field for medical Arabic normalization 6) Term notes input 7) Automatic folder/package generation per session 8) Cloud storage (Google Drive / S3 or similar) This is Phase 1: Data collection only (no diagnosis, no reports). Tech preference: - Frontend: Web (React or simple HTML) - Backend: Python (Flask/FastAPI) - ASR: Whisper - Storage: Cloud + Local Deliverable: A working MVP that allow...

    $774 Average bid
    118 bids

    I need investment performance data scraped from a research website and organized into an Excel file. It needs to be set up for scraping on a regular basis. Requirements: - Experience with web scraping tools and techniques - Ability to extract and format data accurately - Proficiency in Excel - Attention to detail and reliability Ideal Skills: - Familiarity with Python, BeautifulSoup, or similar scraping libraries - Prior experience with financial data - Strong data handling and manipulation skills Please provide a sample of your previous work related to web scraping.

    $29 - $58 / hr
    Sealed NDA
    70 bids

    I have a collection of websites that contain the text I need organised in a single Excel spreadsheet. Your task is straightforward: visit each assigned site, copy the required text exactly as it appears, and paste it into the correct columns and rows of the workbook I supply. Accuracy and consistency matter more than speed. Please keep original spellings, line breaks, and capitalisation, and double-check that no unseen characters or extra spaces slip in. The spreadsheet already has headers; all you do is populate the empty cells beneath them. I will share: • The list of URLs • A short field-by-field guide so you know which snippet of text belongs in which column • The blank .xlsx file You return: • The completed Excel file, ready for me to import into our system ...

    $3 / hr Average bid
    78 bids

    I need a compact Python crawler that pulls public content from Twitter, Instagram and LinkedIn, covering text, image and video posts for any handle I feed it. Here’s the flow I have in mind. The script collects the raw post data (caption, hashtags, basic engagement numbers and, where accessible, image/video URLs) through whichever mix of libraries makes sense—Tweepy or Twitter API v2 for Twitter, Instaloader or Selenium for Instagram, and the official or unofficial LinkedIn API for LinkedIn. After normalising everything into a common JSON schema, the crawler should pass that dataset to an LLM endpoint (OpenAI or similar) and receive back a concise, structured report that includes: • Brand sentiment (positive / neutral / negative trends) • Key thematic buckets t...

    $176 Average bid
    8 bids
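
The "normalising everything into a common JSON schema" step in a multi-platform crawl like this one can be illustrated with a small mapping function. The raw field names below are assumptions for illustration only, not the real Twitter or Instagram API shapes.

```python
# Map platform-specific post payloads into one common schema before the
# dataset is handed to an LLM for the summary report.

COMMON_FIELDS = {"platform", "author", "text", "likes", "media_urls"}

def normalise(platform, raw):
    """Return a dict with exactly the COMMON_FIELDS keys."""
    if platform == "twitter":
        return {"platform": "twitter",
                "author": raw["user"]["screen_name"],
                "text": raw["full_text"],
                "likes": raw["favorite_count"],
                "media_urls": [m["url"] for m in raw.get("media", [])]}
    if platform == "instagram":
        return {"platform": "instagram",
                "author": raw["owner"],
                "text": raw.get("caption", ""),
                "likes": raw["like_count"],
                "media_urls": raw.get("display_urls", [])}
    raise ValueError(f"unknown platform: {platform}")

tweet = {"user": {"screen_name": "acme"}, "full_text": "hello", "favorite_count": 3}
post = normalise("twitter", tweet)
print(post["author"], post["likes"])  # → acme 3
```

Keeping one normaliser per platform means the downstream LLM prompt and report logic never need to know which network a post came from.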

    Scope of Work: Automated Data Extraction and Alert System from Etimad Tenders Portal Project Title: Automated Data Scraping, Google Sheets Integration, and Email Alerts for Etimad Pre-Planning Opportunities 1. Background The Etimad platform ( ) publishes pre-planning and upcoming tenders from Saudi government entities. The goal is to develop a fully automated system within two days that extracts tender data, stores it in a structured Google Sheet, and sends email alerts when new opportunities appear that match specific keywords (to be provided). 2. Objectives Automatically extract all relevant tender data from the Etimad Pre-Planning portal. Store and update the extracted data in Google Sheets on a scheduled basis. Detect and highlight new tenders. Send email alerts for new tenders...

    $232 Average bid
    74 bids
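
The "detect and highlight new tenders" requirement in a monitoring job like this reduces to a set difference between the IDs already stored (e.g., in the Google Sheet) and the IDs just scraped. A minimal sketch, with hypothetical tender IDs and field names:

```python
def detect_new(previous_ids, scraped):
    """Return tenders not seen before, plus the updated id set."""
    new = [t for t in scraped if t["id"] not in previous_ids]
    return new, previous_ids | {t["id"] for t in scraped}

def matches_keywords(tender, keywords):
    """True if any alert keyword appears in the tender title."""
    title = tender["title"].lower()
    return any(k.lower() in title for k in keywords)

seen = {"T-001", "T-002"}
scraped = [{"id": "T-002", "title": "Road works"},
           {"id": "T-003", "title": "Hospital IT tender"}]

new, seen = detect_new(seen, scraped)
alerts = [t for t in new if matches_keywords(t, ["IT", "appalti"])]
print([t["id"] for t in alerts])  # → ['T-003']
```

On each scheduled run, only the `alerts` list would be passed to the email step, so subscribers are never re-notified about tenders already in the sheet.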

    I want a straightforward, low-cost way to pull every unique email address from well over 5,000 messages in my Gmail account. No names, phone numbers, or other fields—just the raw sender and recipient addresses that appear in the header or body of each email. You are free to tackle this with the Gmail API, IMAP, Google Apps Script, Python’s gmail-api-client, or any other method you trust, as long as it stays within Google’s usage limits and my account remains secure. Speed is important but data accuracy matters more; I need a clean list with duplicates removed and obvious bounces, “no-reply”, and Google system addresses filtered out. Deliverable • CSV or XLSX file containing every unique, valid email address extracted from the full mailbox. Acceptance...

    $33 Average bid
    25 bids
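
The core of a mailbox-extraction job like this is a regex pass plus a filter list, regardless of whether the messages arrive via the Gmail API or IMAP. A minimal sketch over inlined message text; the `SKIP` substrings are assumptions standing in for the client's real exclusion rules:

```python
import re

EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")
# Hypothetical exclusions; a real job would also drop known bounce and
# Google system addresses.
SKIP = ("no-reply", "noreply", "mailer-daemon")

def extract_unique(texts):
    """Return sorted, deduplicated addresses found in the given texts."""
    found = set()
    for text in texts:
        for addr in EMAIL_RE.findall(text):
            addr = addr.lower()
            if not any(s in addr for s in SKIP):
                found.add(addr)
    return sorted(found)

messages = [
    "From: Alice <alice@example.com>\nPlease reply to bob@example.org.",
    "From: no-reply@newsletter.example.com\nContact alice@example.com again.",
]
print(extract_unique(messages))  # → ['alice@example.com', 'bob@example.org']
```

Writing the result out is then a one-liner per row with the `csv` module, which satisfies the CSV deliverable directly.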

    CONTACT ENRICHMENT PROJECT: We need contact information (CEO/President, Plant Manager, Purchasing Manager) for automotive supplier companies. The "To Enrich" sheet contains companies that need contact data. The "Already Done" sheet shows completed examples. WHAT TO FILL IN (Yellow columns in "To Enrich" sheet): Column G: CEO/President Name Full name of CEO, President, Managing Director, or General Manager Column H: CEO/President Title Their exact title (e.g., CEO, President, Managing Director) Column I: CEO/President Email Professional email if available (not required) Column J: Purchasing Contact Name Name of Purchasing Manager, Procurement Director, or Head of Purchasing Column K: Purchasing Contact Title Their exact title Column L: Purchasing Contact ...

    $666 Average bid
    127 bids

    I need help extracting text data from CSV files and inputting it into a specified format or system. Requirements: - Experience with data extraction and manipulation - Proficiency in handling CSV files - Attention to detail to ensure accuracy - Ability to work within the specified budget Ideal Skills: - Data entry - Familiarity with spreadsheet software (e.g., Excel) - Basic understanding of data organization principles Looking for freelancers who can complete the task efficiently and accurately. This is the full requirement for the project. Your Operational Requirements (Summary) 1) Quote & Job Intake Create quotes (Q-files) and job packs (J-files) as you do today Save them into SharePoint folders System automatically reads them and creates: Quote records Assets Tasks per as...

    $856 Average bid
    172 bids

    I already have a curated list of LinkedIn profile URLs and need the key networking details moved into a single Google Sheet. For every profile, capture each person’s stated interests and list the five types of people they say they want to meet. Those “meet-up” types should be tagged under the three clear categories I care about—Industry experts, Potential clients and Collaborators—so that I can later filter the sheet by networking goal. Please place one profile per row in the Google Sheet and create separate columns for: • Profile URL • Name (as it appears) • Interests (comma-separated) • Type 1 through Type 5 (verbatim wording) • Category tag (Industry experts / Potential clients / Collaborators) Accuracy of the text you...

    $17 Average bid
    15 bids

    I need a small utility that sits in the background, pings a single public web page every second, and alerts me the moment it detects any difference in either the visible text or the images. The page updates unpredictably, so true real-time tracking is essential; a one-minute polling interval is already too slow for my use-case. A lightweight approach that respects the site’s bandwidth and avoids triggering blocks or captchas will be valued. I am open to whatever stack you favour—Python with BeautifulSoup or Selenium, Node.js with Puppeteer, or a compiled solution—so long as it is stable on a Windows environment and easy for me to tweak the target URL later. Notification method is flexible: an email is fine, but if you have a smarter suggestion (desktop toast, webhoo...

    $18 Average bid
    14 bids
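
The heart of a change-watcher like the one described is a cheap fingerprint of the page's visible text and image URLs, compared against the previous poll. A minimal sketch using only the standard library; the page content shown is invented for illustration:

```python
import hashlib

def fingerprint(text_content, image_urls):
    """Hash the visible text plus the sorted image URL list into one digest."""
    h = hashlib.sha256()
    h.update(text_content.encode("utf-8"))
    for url in sorted(image_urls):
        h.update(url.encode("utf-8"))
    return h.hexdigest()

def changed(prev_digest, text, images):
    """Compare the current page against the last digest."""
    current = fingerprint(text, images)
    return current != prev_digest, current

# One polling step; the real tool would run this once per second against
# the freshly fetched page and fire a notification when it returns True.
baseline = fingerprint("Price: $10", ["/img/a.png"])
is_changed, baseline = changed(baseline, "Price: $12", ["/img/a.png"])
print(is_changed)  # → True
```

Because only a digest is stored between polls, memory stays constant no matter how large the page is, and conditional requests (`If-Modified-Since` / `ETag`) can further reduce bandwidth before hashing is even needed.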
    Find 15K B2B Emails
    5 days left
    Verified

    I have an Excel document of roughly 15,000 B2B companies operating worldwide. For every company on that list, I need one working, business-grade email address. Preferred research channels are clear: • Company websites • LinkedIn profiles • Reputable public databases (e.g., Apollo) Accuracy is more important than speed. If an address cannot be found after reasonable effort, note “No email found” in the sheet so I can audit the attempt. Deliverables: 1. Completed CSV or Excel file containing – Company name (as provided) – Discovered email address – Source link or brief source note 2. A short summary of any recurring issues you encountered (e.g., companies with only web forms). Please keep formatting consistent and a...

    $187 Average bid
    62 bids

    I need an end-to-end AI agent that automatically scouts freelancing websites, general job boards, and specialised training platforms for roles or courses that involve artificial-intelligence work. The agent must: • Crawl and scrape the relevant pages in real time or on a frequent schedule. • Apply NLP or other classification techniques to decide whether a posting is truly AI-related, then tag it by sub-domain (e.g. vision, NLP, MLOps, prompt-engineering). • Deliver concise, deduplicated listings to me through an in-app notification feed—no email or SMS required. For the deployment side I’m open to Python (Scrapy, BeautifulSoup, Selenium), Node, or any stack you are comfortable with so long as it is containerised and can run unattended on a small cloud insta...

    $106 Average bid
    14 bids

    I need a single Google Sheet that lets me type a Home Depot or Lowe’s model number in one column and, without any extra clicks, instantly fills the rest of the row with: • Brand • Product title • Full description • Current price on Home Depot • Current price on Lowe’s • Main image URL I’d like the sheet to refresh these fields automatically once every day so I always see up-to-date pricing. I’m happy to use the HASData API (or another service if you can show a better option), and I’ll cover the subscription cost myself; I just need you to wire everything up in Apps Script or another reliable method so the calls stay within API limits and don’t break if the catalog grows. Deliverables • A Google Sheet t...

    $40 Average bid
    45 bids

    Title: Senior Python Developer for US Data Pipeline and iOS Verification System (Phase 1) Project Description Suggestion: Overview: > We are looking for a senior Python developer to build an automated data scraping and iOS verification pipeline based in the US. The goal for Phase 1 is to acquire over 10,000 verified leads per day. Core Tasks: 1. Data Scraping: Extract data (name, phone number, age, gender, carrier) from US people search websites. 2. Anti-detection: Must integrate the API and set render=true and super=true. 3. Data Filtering: Implement automatic filtering by wireless/phone number and age range (50-90 years old). 4. Data Verification: Integrate the LoopLookup API to verify iMessage activation status. 5. Data Export: Automatically sort and export data to tagged .t...

    $4678 Average bid
    116 bids

    Hey! I’m looking to hire an experienced developer to build a universal product-detail scraping pipeline that takes a product URL (any website) and returns a complete structured product record. This is not a “simple HTML parse.” Many target sites are React/Next/Vue, load content via XHR/GraphQL, hide details behind tabs/accordions/modals, and lazy-load images/PDFs. The solution needs to reliably extract everything a human can see on the page, plus the underlying data used to render it. What the scraper must do (high level) Given a product URL, the pipeline should: Load the page like a real user (handle cookies/overlays). Capture all content from multiple sources (DOM + network + interactions). Use GPT API strategically to increase accuracy (field mapping, variant ext...

    $268 Average bid
    130 bids

    I need help collecting a clean, well-structured list of Twitter accounts that consistently post about AI, optionally tagged by category of AI (open source, ML, AI, general AI). Instead of handing you a fixed list, I’ll define the selection rules (for example: minimum follower count, specific AI-related keywords, recent activity, etc.) - minimum follower count of 5,000 and at least multiple posts with 100+ likes/retweets. Once those criteria are agreed on, you’ll locate the matching profiles and extract two data points per account: • the public profile bio • the direct profile link (around 1M+ profiles) Please return everything in a single CSV file, one row per influencer. Feel free to use Python, Tweepy, Twitter API v2, ScraperAPI, or another reliable method—as long...

    $134 Average bid
    17 bids

    AI Automation for Finance Analytics AI / Machine Learning DO NOT BID IF BIDDING FOR 40-HOUR WORK WEEK WE ARE LOOKING FOR A CONSULTANT / BUILDER / TUTOR TO WORK WITH OUR TEAM 3-5 HOURS A WEEK TO BUILD THE SYSTEM JOINTLY DO NOT BID FOR LONGER THAN THOSE HOURS. DO NOT BID FOR FULL-TIME WORK DETAILS OF WHAT I NEED HELP WITH I run a real estate private equity and hotel development platform. We want to replace manual analysis and reporting with a practical AI workflow. This is about extracting, comparing, and interpreting data. Excel and PowerPoint remain the source of truth. What we need: -Compare PowerPoint vs Excel and flag mismatches - Explain underwriting models and trace outputs - Compare legal/term sheets vs financial assumptions - Track document versions and changes - Summarize deal ...

    $31 / hr Average bid
    116 bids

    I have a single Instagram Reel that was publicly available for roughly a year before being removed or placed in archive. I saved every trace I could—direct links, full-length screen recordings, and the search-engine cache hits that still reference the post. What I now need is a technical reconstruction of its viewership data. Your objective is to extract and corroborate: • Number of views over time (ideally plotted or tabled) • Any available demographic clues about who watched it • Engagement rates the Reel achieved while live Because the original URL now returns a 404, I expect most of the intel will come from open-source techniques: exploring Web archives (Wayback Machine snapshots, Google cache, ), digging into any residual JSON, and cross-referencing with Ins...

    $25 Average bid
    12 bids

    I need an experienced Python trading-bot developer to optimize and refactor a live async trading bot connected to REST & WebSocket APIs, which currently slows under load and misses ticks/orders. The task includes profiling bottlenecks, improving async/WebSocket performance, optimizing pandas & SQLite usage, and ensuring real-time execution. Goal: <200 ms tick-to-order latency, zero missed ticks, clean refactored code, tests, and one-command VPS setup.

    $35 Average bid
    20 bids
    Scrape & Import Products
    4 days left
    Verified

    I need every product on copied into my existing WooCommerce shop so the catalog mirrors theirs one-to-one. That means grabbing each item’s title, plain-text description and all associated images, then pushing them into WordPress with the correct categories, colour swatches and size variations exactly as they appear on Furnx. Only product details are required right now—reviews and live stock counts can wait for a later phase—so the job is focused on clean data capture and a flawless import workflow. Descriptions must remain in plain text; no extra HTML markup. Images should arrive attached to the right variation, including separate gallery shots where available, and the colour options need to show as clickable swatches in WooCommerce, not just text labels. I’m...

    $27 / hr Average bid
    192 bids

    I’m looking for an experienced AutoHotkey (AHK) developer to build a clean, reliable script that automates the repetitive navigation and clicking I perform every day inside my web application. Here’s the core scenario: the macro will launch a browser tab, step through a predictable series of pages, click specific buttons or links, wait for elements to load, and continue until the end of the workflow—no data scraping or form filling is required, just fast, accurate page-to-page movement and element selection. I’ll provide: • A screen-recording that shows the exact click path and timing cues • XPaths, CSS selectors, or unique element IDs where available • Any login credentials needed for testing (in a secure manner) You’ll deliver: ...

    $43 Average bid
    20 bids

    I need a verified list of 100 LinkedIn contacts who sit at mid-management level—think Managers, Directors, and Vice Presidents—inside grocery, retail and beverage store chains across the country. I will give you the exact companies and job titles to hunt for, so your job is purely about smart searching and clean data capture. Once you locate a match, record the person’s full name, current title, company, LinkedIn profile URL and any publicly available work email (if it can be found) you can source. Please keep everything tidy in a spreadsheet (Excel or Google Sheets is fine) and make sure there are no duplicates, no consultants, and nobody outside the United States. If you use tools like LinkedIn Sales Navigator, Apollo, Hunter, etc., mention that when you bid so I k...

    $179 Average bid
    58 bids

    I have two source spreadsheets that I need merged and enriched through automated scraping: • “File 1” – 170 k Spanish local businesses with emails • “File 2” – 65 k additional businesses with websites only Phase 1 – Email extraction Using a Python script and well-known libraries (requests, BeautifulSoup, Scrapy or similar), scan every site listed in File 2, capture all working email addresses you can locate, then append them to the corresponding rows so I can produce a unified “File 3”. Phase 2 – Offer harvesting Next, visit each live site in File 3. Where an offer, deal or promotion is publicly displayed, record the details in a fresh Excel sheet with these exact columns: Business ID | Business Name | Offer...

    $187 Average bid
    18 bids
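
Phase 1 of a merge-and-enrich job like this is mostly a keyed join: attach scraped emails to the rows of "File 2", then concatenate with "File 1" into a unified "File 3". A sketch in miniature; the business names, field names, and the `scraped_emails` dict (standing in for the requests/BeautifulSoup crawl) are all hypothetical:

```python
# "File 1": businesses that already have emails.
file1 = [{"id": 1, "name": "Bar Pepe", "email": "pepe@example.es"}]

# "File 2": businesses with websites only.
file2 = [{"id": 2, "name": "Taller Sol", "website": "https://tallersol.example"}]

# Stand-in for the crawling step that would visit each site in File 2.
scraped_emails = {"https://tallersol.example": ["info@tallersol.example"]}

# Append the first discovered address to each File 2 row.
for row in file2:
    emails = scraped_emails.get(row["website"], [])
    row["email"] = emails[0] if emails else ""

# Unified "File 3", ready to export to Excel.
file3 = file1 + file2
print([r["email"] for r in file3])  # → ['pepe@example.es', 'info@tallersol.example']
```

At 65k sites, the real crawl would want polite rate limiting and retries, but the merge logic itself stays this simple.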

    Summary We’re a growing digital marketing agency looking for an experienced automation specialist to help us design and implement scalable internal workflows. We’re moving into a more automated operating model and want to work with someone who can both advise on best practices and build the systems with us. What We Need We’re looking to automate processes such as: Automatically scraping a new client’s website and relevant public social profiles upon signup Structuring and exporting that data into organized files (Google Drive/Docs/Sheets) Creating standardized client folder structures in Google Drive Connecting onboarding forms to project management tools Automating internal task creation for our team Integrating AI tools (e.g. GPT workflows) into onboardin...

    $813 Average bid
    71 bids

    I need a data scraping expert to help generate leads from a list of websites. Requirements: - Scrape contact information, product listings, or user reviews (to be specified). - Work from a provided list of URLs. Ideal Skills: - Experience with data scraping tools and techniques. - Ability to handle multiple URLs and extract data accurately. - Attention to detail and reliability. Please share your portfolio and relevant experience.

    $103 Average bid
    33 bids

    My goal is to boost overall search accuracy across web, conversational, and voice-based platforms, and I need a small team that can run continuous quality checks on three fronts: • First, you will rate the relevance of live web search results against real user queries, flagging mismatches and edge cases. • Second, you will review AI-generated snippets, answers, and summaries, highlighting factual errors, bias, or tone problems and suggesting concise fixes. • Third, you will test voice recognition output by speaking prescribed prompts, noting transcription errors, pronunciation gaps, and language-variant issues. I will supply detailed guidelines, evaluation rubrics, and annotation tools; you simply log in, follow the task queue, and record findings inside the platform. ...

    $55 / hr Average bid
    117 bids

    I have a collection of websites that contain the text I need organised in a single Excel spreadsheet. Your task is straightforward: visit each assigned site, copy the required text exactly as it appears, and paste it into the correct columns and rows of the workbook I supply. Accuracy and consistency matter more than speed. Please keep original spellings, line breaks, and capitalisation, and double-check that no unseen characters or extra spaces slip in. The spreadsheet already has headers; all you do is populate the empty cells beneath them. I will share: • The list of URLs • A short field-by-field guide so you know which snippet of text belongs in which column • The blank .xlsx file You return: • The completed Excel file, ready for me to import into our system ...

    $383 Average bid
    123 bids

    I have an existing Excel sheet of aged Instagram leads and I need a fresh, human-eyed review of every single profile on that list. Your task is simple but detail-oriented: open each handle, confirm it is still active by checking whether they have posted anything hair-related in the last month, and note the information I specify below. Here is exactly what I want verified on every profile: • Recent posts activity – record the date of the most recent post so I can see at a glance who is active and who is dormant. • Availability of contact information – confirm whether an email, phone number, or “email” button is visible in the bio or contact section. No bots or scraping tools, please; I want a manual check for accuracy. Deliverables • The original Excel file re...

    $13 / hr Average bid
    86 bids

    Industrial Automation Product Data Extraction, Deduplication & Structured Image Collection Project Overview We are an industrial automation parts distributor building a structured product database to support inbound enquiries and SEO growth. We require an experienced data extraction specialist to: Extract structured product data from major industrial / electronic component distributor websites Identify duplicate manufacturer part numbers across multiple sources Merge all unique information into a single consolidated dataset Extract and organise all available product images per part number Deliver a clean, deduplicated, production-ready dataset This project includes: Data extraction Normalization Deduplication Intelligent merging Structured image collection and organisation...

    $1077 Average bid
    74 bids
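
The deduplication-and-merge core of a parts-database job like this usually keys on a normalised manufacturer part number and unions the remaining fields. A minimal sketch with invented part numbers, sources, and field names:

```python
def merge_records(records):
    """Merge rows sharing a manufacturer part number: union the image URLs
    and sources, and keep the first non-empty description seen."""
    merged = {}
    for rec in records:
        key = rec["part_number"].strip().upper()  # normalise the join key
        if key not in merged:
            merged[key] = {"part_number": key, "images": set(), "sources": set()}
        entry = merged[key]
        entry["images"].update(rec.get("images", []))
        entry["sources"].add(rec["source"])
        if rec.get("description") and "description" not in entry:
            entry["description"] = rec["description"]
    return merged

rows = [
    {"part_number": "6es7 214", "source": "distributor-a",
     "description": "Siemens CPU", "images": ["a.jpg"]},
    {"part_number": "6ES7 214", "source": "distributor-b", "images": ["b.jpg"]},
]
result = merge_records(rows)
print(len(result), sorted(result["6ES7 214"]["images"]))  # → 1 ['a.jpg', 'b.jpg']
```

Real part numbers need more careful normalisation (stray hyphens, spacing variants, manufacturer prefixes), but the merge pattern, one canonical key with unioned attributes, carries over directly.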
    AI Rewrite & Data Insertion
    3 days left
    Verified

    I have a collection of web pages that I need turned into clean, original copy and then loaded into my system. The raw material is plain-text extracted directly from those pages—no numerical or mixed data involved—so the entire job revolves around handling text content only. Here is the workflow I have in mind: first you’ll grab the plain text from each specified URL, strip away anything that isn’t core content, and feed that text into your preferred rewriting engine (OpenAI, GPT-based, or another high-quality NLP tool). The goal is a fluent, human-sounding rewrite that preserves meaning while clearing any potential plagiarism checks. Once the rewrite is approved, you will insert the new text back into the destination I provide (CSV template or the web form in my CM...

    $209 Average bid
    156 bids

    I need a sharp, Excel-savvy researcher to turn scattered developer brochures and website data into one clean, filter-ready spreadsheet. Your task is to compile every pre-selling or RFO project you can find from the major developers that operate in my target markets—primarily Metro Manila (with emphasis on Quezon City, Manila, Pasig and Valenzuela) plus key growth hubs in North Luzon. For each project, capture the essentials I use when pitching to buyers: • Developer name and exact project name • Precise location/address • Project type (condo, house-and-lot, lot only) • Highlight amenities offered • Complete payment terms and a sample computation straight from the developer’s price sheet • Contract price range and reservation fee Pleas...

    $204 Average bid
    43 bids

    I have a predefined list of topics and I need a methodical web-researcher to comb through the internet, identify credible organization sites related to each item, and capture every relevant online asset they host. My end goal is a clean, well-structured spreadsheet that I can tap for future research. Here’s what I expect: • For every topic on my list, locate organization websites that speak directly to it. • Record the full site URL, the specific page URL where the asset lives, the page title, and a one-line summary of why that page is useful. • If the page offers downloadable material (reports, documents, images, videos, or any other internet asset), note the direct download link. No need to download the files yourself—just give me reliable links. •...

    $124 Average bid
    21 bids

    I want a comprehensive, ready-to-use database of transport-related businesses that operate anywhere in New South Wales. The goal is roughly 15,000 unique, verified email contacts pulled from Google Maps, company websites or any other reliable public source you normally trust for web-scraping. The scope covers every organisation in these six categories: • Freight & Transport / Haulage / Trucking Companies • Courier Services & Delivery Providers • Bus & Coach Operators, Charters, Hire and School Bus Services • Logistics Services / 3PL operators • Taxi, Hire-Car, Rideshare Fleet Operators plus Car Rental & Hire Firms • Removalists and Furniture Movers For each listing I need the following fields, all separated into their ...

    $192 Average bid
    72 bids

    Project Description: Find school districts and charter schools that use a specific vendor, across a large list of domains. I am seeking an experienced web scraping specialist to improve our Python script so it analyzes a large list of school district websites (approximately 4,000+ URLs) and identifies the ones that display a specific link on any page found in their sitemap. The primary method of identification must be to scan the websites for specific, known vendor links. Deliverables Required 1. A Production-Ready Python Script (.py file): The script must be commented, easily configurable, and capable of reading the provided CSV list, performing the scan, and generating the output CSV. It should handle timeouts and basic errors gracefully. 2. The Final Results (CSV/Excel File): A c...
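The core of such a script is sitemap parsing plus a per-page link check, with failures treated as "not found". A minimal sketch, standard library only; the vendor domain is a placeholder, and the real script would wrap `scan_site` in a loop over the provided CSV and write an output CSV:

```python
from urllib import request, error
from xml.etree import ElementTree

VENDOR_LINK = "vendor.example.com"   # placeholder for the known vendor domain
TIMEOUT = 10                         # seconds per request

def page_urls_from_sitemap(xml_text: str) -> list:
    """Extract every <loc> entry from a sitemap.xml document."""
    root = ElementTree.fromstring(xml_text)
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    return [loc.text for loc in root.findall(".//sm:loc", ns)]

def page_has_vendor_link(html: str) -> bool:
    return VENDOR_LINK in html

def fetch(url: str) -> str:
    with request.urlopen(url, timeout=TIMEOUT) as resp:
        return resp.read().decode("utf-8", "replace")

def scan_site(base_url: str) -> bool:
    """Fetch the site's sitemap, then check each listed page for the vendor link."""
    try:
        for url in page_urls_from_sitemap(fetch(base_url.rstrip("/") + "/sitemap.xml")):
            if page_has_vendor_link(fetch(url)):
                return True
    except (error.URLError, ElementTree.ParseError, TimeoutError):
        pass  # basic error handling: unreachable or malformed sites count as "not found"
    return False
```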

    $237 Average bid
    142 bids

    I need an .xlsm workbook whose VBA macro fetches product data from both and lowes.com. When I type a valid item or model number into a row, the code should automatically pull back: product name, full description, regular price, sale price (if available), brand, product type/category, and the main image (inserted into the sheet or stored in an Image column). I work comfortably with VBA, so a concise, well-commented routine is all I need—no step-by-step user guide. The workbook must stay self-contained, relying only on standard references such as Microsoft XML, HTML, or WinHTTP libraries; please avoid external add-ins or Python bridges. Deliverables: • Finished macro-enabled Excel file (.xlsm) ready to test with my own SKU list • Clearly commented VBA code so I can...

    $46 Average bid
    53 bids

    I need assistance merging my current football dataset with a new one. This new dataset will be sourced from online scraping of weather and expected goals (xG) data. Requirements: - Scrape data from official weather and football statistics websites. - Integrate the following weather data: temperature, humidity, and precipitation. - Work with datasets in Excel format. - Correlate this new data with historical football match data in my existing dataset. Ideal Skills and Experience: - Proficiency in data scraping and data manipulation. - Experience with Excel and handling large datasets. - Familiarity with weather and football data. - Strong analytical skills to ensure accurate correlation of datasets. Looking forward to your proposals!
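Once the weather figures are scraped, correlating them with the historical matches is typically a keyed join. A small pandas sketch with invented sample rows (the real frames would come from `pd.read_excel` and the scrape):

```python
import pandas as pd

# Invented fragments of the two datasets; the real ones would come from
# pd.read_excel(...) and from the scraped weather / xG sources.
matches = pd.DataFrame({
    "match_date": ["2024-03-02", "2024-03-09"],
    "home_team": ["Team A", "Team B"],
    "goals": [2, 1],
})
weather = pd.DataFrame({
    "match_date": ["2024-03-02", "2024-03-09"],
    "temperature_c": [8.5, 11.0],
    "humidity_pct": [71, 64],
    "precipitation_mm": [0.0, 2.3],
})

# Left-join on the shared key so every historical match keeps its row
# even when a weather reading is missing.
combined = matches.merge(weather, on="match_date", how="left")
```

Writing the result back to Excel is then `combined.to_excel(...)`; when several matches share a date, a composite key (date plus teams) is the safer join key.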

    $294 Average bid
    84 bids
    Selenium LinkedIn View Bot
    2 days left
    Verified

    I need a Selenium-based solution that runs reliably on Windows and opens Google Chrome to simulate human visits to LinkedIn (and occasionally other) profile URLs listed in a Google Sheet. For each URL the program should: • Pull the next unused link from the sheet • Load the page in Chrome, wait a random time between 20 seconds and 3 minutes • Apply truly randomized scrolling patterns while the profile is open so behaviour looks organic • Fire a webhook the moment the visit completes, passing back any ID or payload I define so our CRM reflects the touch instantly Configuration items such as Google Sheet ID, webhook endpoint, minimum/maximum dwell time, and daily visit caps should live in a simple file I can edit without touching code. A short README on installi...

    $244 Average bid
    115 bids

    I need a freelancer outside the USA to gather some data and provide me with a code snippet. Ideal skills and experience: - Experience in data gathering - Familiarity with coding in Python, JavaScript, or Ruby - Ability to work independently and deliver accurate results Please provide details on your data collection methods and coding expertise in your bids.

    $222 Average bid
    88 bids

    We are building a full internal marketplace analytics web system, not just a reporting script. The system is designed to combine competitive intelligence with internal sales and stock analytics in a single interface. Functional Requirements The system must provide the following capabilities: 1. Product and SKU structure - Each product must be split into individual SKUs based on flavor and volume. - All analytics and reports are built at the SKU level. 2. Our product analytics (primary focus) - Current stock levels (total and per SKU). - Sales volume for selected periods (daily / weekly / monthly). - Reorder recommendations based on stock thresholds and sales dynamics. - Revenue calculations per product and per SKU with period filtering. 3. Competitive analytics - Automated collection o...
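The SKU-splitting requirement in point 1 can be modelled directly: one product record fans out into flavor × volume combinations, and every report keys on those. A tiny sketch with a hypothetical product (the flavor and volume lists are invented):

```python
from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class SKU:
    product_name: str
    flavor: str
    volume_ml: int

def explode_product(name, flavors, volumes):
    """Split one product into individual SKUs by flavor and volume,
    so all analytics and reports can be keyed at the SKU level."""
    return [SKU(name, f, v) for f, v in product(flavors, volumes)]

skus = explode_product("Drink X", ["citrus", "berry"], [250, 500])
```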

    $3549 Average bid
    200 bids

    I have a sizable dump of customer records—names, contact numbers, email addresses, and a few extra fields—that must be transferred into a single, well-organized Excel workbook. I will send you the exact header template, so every column you create must match it precisely. Your task involves: • Importing the raw files into Excel (or Power Query, if you prefer) and mapping each entry to the columns I supply: Names, Contact Numbers, Email Addresses, and the additional fields. • Removing every duplicate without losing valid information. • Applying basic data-validation rules (drop-downs, text length limits, email format checks, etc.) so the sheet remains clean long after this project ends. • Consistently formatting phone numbers and email addresses, fixing...
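The phone-number normalisation and email format check described above can be sketched in pandas; the sample records and the regexes are illustrative only, not the client's actual validation rules:

```python
import pandas as pd

# Invented sample records; the real data comes from the client's raw files.
records = pd.DataFrame({
    "Name": ["Ana Cruz", "Ana Cruz", "Ben Ito"],
    "Contact Number": ["0917 123 4567", "(0917) 123-4567", "0917 765 4321"],
    "Email Address": ["ana@example.com", "ana@example.com", "ben@example"],
})

# Normalise phone numbers to digits only so formatting differences
# don't hide duplicates.
records["Contact Number"] = records["Contact Number"].str.replace(r"\D", "", regex=True)

# Flag badly-formed email addresses with a simple format check.
email_ok = records["Email Address"].str.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

# Remove duplicates without losing valid information: after normalisation,
# the two "Ana Cruz" rows are recognised as the same record.
deduped = records.drop_duplicates(subset=["Name", "Contact Number", "Email Address"])
```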

    $15 / hr Average bid
    68 bids

    I have about 500 genuine customer testimonials sitting on another well-known review platform, and they belong on my Google Business profile instead. Every word has already been approved by the original authors, so no rewriting or polishing is required—I want them posted exactly as they appear now. Here is what I need from you: pull each review from the source link I’ll provide, publish it to my Google Business page without altering a single character, then give me proof that every post has gone live (a simple spreadsheet with the review text and a direct Google URL or timestamped screenshot is fine). Accuracy is crucial; I will cross-check that nothing has been omitted or modified. If you already manage multiple Google accounts or have an efficient, policy-compliant workflow ...

    $35 Average bid
    18 bids

    Complete Lottery Prediction and Betting Automation System (Focused on Loterías y Apuestas del Estado - Spain) 2. System Features 2.1. Historical Data Collection and Update The system must automatically download complete historical results (drawn numbers, draw dates, prize breakdowns by category, accumulated jackpots) from the first draw of each lottery, directly from or reliable associated sources. Specific sources: Euromillones: (since Feb 13, 2004) La Primitiva: (since Oct 17, 1985 – modern version) El Gordo de la Primitiva: (since Oct 31, 1993) Updates run automatically at exactly 00:02 on the day after each draw, using ethical scraping (BeautifulSoup/Scrapy) with proper user-agent headers to mimic human behaviour. Store data in PostgreSQL (structured) or MongoDB (flex...
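The collection step named above (BeautifulSoup with an explicit User-Agent) might look like the sketch below: a polite fetch plus a parser for one results page. The CSS classes and page layout are placeholders; the real selectors must come from inspecting the actual results pages.

```python
from urllib.request import Request, urlopen
from bs4 import BeautifulSoup

# Identify the scraper honestly instead of hiding behind a blank default agent.
HEADERS = {"User-Agent": "Mozilla/5.0 (compatible; lottery-archive-bot)"}

def fetch(url: str) -> str:
    """Download a results page with an explicit User-Agent header."""
    return urlopen(Request(url, headers=HEADERS), timeout=15).read().decode("utf-8")

def parse_draw(html: str) -> dict:
    """Pull the draw date and drawn numbers out of one results page.
    The class names "draw-date" / "draw-number" are made-up placeholders."""
    soup = BeautifulSoup(html, "html.parser")
    date = soup.select_one(".draw-date").get_text(strip=True)
    numbers = [int(td.get_text()) for td in soup.select(".draw-number")]
    return {"date": date, "numbers": numbers}
```

The parsed dictionaries would then be inserted into the PostgreSQL or MongoDB store the brief mentions.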

    $2346 Average bid
    82 bids

    I'm trying to run the attached Jupyter notebook script to get info from a website, but I can't understand why it doesn't work. I need this script to be fixed, plus pagination added so it fetches around 2,400 records from YellowPages. I only use Jupyter.
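Independently of whatever bug is in the notebook, the pagination part usually reduces to a loop like the sketch below, which runs as-is in a Jupyter cell. The URL pattern, query parameters and page size are assumptions to be checked against the real site:

```python
import time
from urllib.request import Request, urlopen

TARGET_RECORDS = 2400

def fetch_page(page: int) -> str:
    """Download one results page; the URL pattern here is a placeholder."""
    url = f"https://www.yellowpages.com/search?search_terms=example&page={page}"
    req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urlopen(req, timeout=15) as resp:
        return resp.read().decode("utf-8", "replace")

def paginate(fetch, parse, target: int, delay: float = 1.0) -> list:
    """Keep requesting pages until `target` records are collected
    or a page comes back empty (end of results)."""
    records, page = [], 1
    while len(records) < target:
        batch = parse(fetch(page))
        if not batch:
            break                 # no more results
        records.extend(batch)
        page += 1
        time.sleep(delay)         # be polite between requests
    return records[:target]
```

With the notebook's own record parser this becomes `records = paginate(fetch_page, my_parser, TARGET_RECORDS)`, where `my_parser` is whatever function extracts records from one page of HTML.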

    $31 Average bid
    57 bids
