Create a Script Which Aggregates Listings From 4 Different Forums into My WordPress Website

$30-250 USD

Closed
Posted over 8 years ago

Paid on delivery
I'm looking for a web developer who can create a custom script or WordPress plugin which will:

1) Scrape specific data from 4 different forum websites on a daily basis (listings of specific animals for sale).
2) Add/update those listings, properly formatted, into my WordPress site (via the Easy Digital Downloads plugin) on a daily basis.
3) Ensure listings are posted into the correct category corresponding to the forum they were pulled from.
4) Delete listings which have been marked as sold or removed.

Note: I have created an account on my website for each of the 4 forums we're scraping. I would like all postings to be posted from the corresponding account; these must match up for consistency.

The data to be pulled and properly formatted for each listing:
-Photo(s) of animal
-Title of listing/post page
-Price of animal
-URL of post/listing page
-Weight of animal (in grams)
-Sex of animal
-Description
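As a rough illustration of the workflow being requested, here is a minimal Python sketch of one daily pass (requests + BeautifulSoup pushing into the WordPress REST API). Every forum URL, CSS selector, category ID, account credential, and the REST endpoint for the Easy Digital Downloads "download" post type is a placeholder assumption; a real implementation would also need de-duplication against previously imported listings, image uploads, and a daily scheduler such as cron.

```python
# Hypothetical sketch only: one daily pass for a single forum.
# URLs, selectors, category IDs, credentials, and the EDD REST endpoint
# are placeholders, not values taken from the project description.
import requests
from bs4 import BeautifulSoup

WP_BASE = "https://example-wp-site.com/wp-json/wp/v2"
EDD_ENDPOINT = f"{WP_BASE}/download"   # assumes the EDD post type is exposed via show_in_rest

FORUMS = {
    # forum key: (for-sale index URL, WP category ID, WP account used for that forum)
    "forum1": ("https://forum1.example.com/for-sale", 12, ("forum1-bot", "app-password")),
}

def scrape_forum(index_url):
    """Yield one dict per listing found on the forum's for-sale index page."""
    html = requests.get(index_url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    for post in soup.select(".listing"):                      # placeholder selector
        yield {
            "title": post.select_one(".title").get_text(strip=True),
            "url": post.select_one("a")["href"],
            "price": post.select_one(".price").get_text(strip=True),
            "weight_g": post.select_one(".weight").get_text(strip=True),
            "sex": post.select_one(".sex").get_text(strip=True),
            "description": post.select_one(".desc").get_text(strip=True),
            "photos": [img["src"] for img in post.select("img")],
            "sold": bool(post.select_one(".sold-badge")),     # placeholder "sold" marker
        }

def push_listing(listing, category_id, auth):
    """Create (or update) the WordPress/EDD post for one scraped listing."""
    payload = {
        "title": listing["title"],
        "content": listing["description"],
        "status": "publish",
        "categories": [category_id],
        # Price, weight, sex, and source URL would normally live in post meta /
        # EDD fields; writing meta over REST requires the keys to be registered.
        "meta": {
            "source_url": listing["url"],
            "price": listing["price"],
            "weight_grams": listing["weight_g"],
            "sex": listing["sex"],
        },
    }
    resp = requests.post(EDD_ENDPOINT, json=payload, auth=auth, timeout=30)
    resp.raise_for_status()
    return resp.json()["id"]

def delete_listing(post_id, auth):
    """Remove a WordPress post whose source listing was marked sold or removed."""
    requests.delete(f"{EDD_ENDPOINT}/{post_id}", auth=auth, timeout=30).raise_for_status()

if __name__ == "__main__":
    for forum, (index_url, category_id, auth) in FORUMS.items():
        for listing in scrape_forum(index_url):
            if listing["sold"]:
                # A real run would look up the existing WP post by source_url and delete it.
                continue
            push_listing(listing, category_id, auth)
```

Using a separate WordPress account (application password) per forum is what lets each imported post appear under the matching author, as the brief requires.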
Project ID: 8950194

About the project

10 proposals
Remote project
Active 8 yrs ago

10 freelancers are bidding on average $201 USD for this job
I can do this.
$155 USD in 3 days
5.0 (71 reviews)
6.7
A proposal has not yet been provided
$155 USD in 3 days
4.9 (120 reviews)
6.5
We have a good amount of experience in web scraping using Python, Django, and Node.js. Our latest web-scraping project in Python is "Electronics Parts Intelligence Processing": eProductScrapper, a scraping and data-mining oriented project built on Scrapy and lxml, with a Celery distributed environment running over Redis. It focuses on electronics parts, fetching information such as product details, SKU, technical datasheets (PDF), product stock, and price history, which is then used to present the product life cycle in a highly readable way and make non-authorized sellers, brokers, and aftermarket sellers more aware of market requirements for those products. Technology & frameworks used: Python, Django, Celery, Scrapy, Node.js, MongoDB, MySQL. We would love to have an ongoing relationship with your team and are ready to work on your schedule, 40-50 hrs per week as required. Thanks & Regards
$147 USD in 5 days
4.9 (18 reviews)
6.6
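For reference, the kind of Scrapy-plus-Celery setup the bid above describes could look roughly like the sketch below. Spider names, URLs, selectors, and the schedule are placeholder assumptions, not the bidder's actual code; in a long-lived Celery worker the crawl is usually launched as a subprocess (`scrapy crawl ...`) because Twisted's reactor cannot be restarted in-process.

```python
# Hypothetical sketch of a Scrapy spider scheduled daily through Celery beat
# over a Redis broker. All names, URLs, and selectors are placeholders.
import subprocess

import scrapy
from celery import Celery
from celery.schedules import crontab

app = Celery("scraper", broker="redis://localhost:6379/0")

class ListingSpider(scrapy.Spider):
    name = "listings"
    start_urls = ["https://forum.example.com/for-sale"]        # placeholder URL

    def parse(self, response):
        for row in response.css(".listing"):                   # placeholder selector
            yield {
                "title": row.css(".title::text").get(),
                "price": row.css(".price::text").get(),
                "url": response.urljoin(row.css("a::attr(href)").get()),
            }

@app.task
def run_daily_scrape():
    # Shell out to `scrapy crawl` rather than running the reactor in-process,
    # since Twisted's reactor cannot be restarted inside a long-lived worker.
    subprocess.run(["scrapy", "crawl", "listings", "-O", "listings.json"], check=True)

# Celery beat entry: run the scrape every day at 03:00.
app.conf.beat_schedule = {
    "scrape-forums-daily": {
        "task": run_daily_scrape.name,
        "schedule": crontab(hour=3, minute=0),
    },
}
```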
Hello, I'm ready to start! I can do this well and am interested in the project. I have a lot of experience with related projects of different types. I am waiting for your quick positive reply. Thanks
$150 USD in 3 days
4.6 (88 reviews)
5.9
A proposal has not yet been provided
$250 USD in 3 days
4.8 (17 reviews)
4.9
Hi, I'm an expert in data entry and scraping jobs, so I can do this well, and I have a lot of experience with this type of job. Waiting for your reply to start. Regards, Sarika
$111 USD in 3 days
5.0 (12 reviews)
3.4
I've got plenty of experience in WordPress, PHP site-scraping as well as plugins, so this will be a pretty easy task for me. Let's go scrape stuff!
$250 USD in 5 days
5.0 (15 reviews)
3.1
Hello, we have done this kind of work before. I have 5 years of experience and I am sure I will provide you 100 percent satisfaction.
$111 USD in 5 days
0.0 (0 reviews)
0.0

About the client

Hingham, United States
5.0
41
Payment method verified
Member since Aug 19, 2012

Client Verification
