This will be an easy job for an experienced PHP programmer.
We have a list of more than 100 public-domain sites. We want a system that will scrape these sites for information based on keywords we specify.
1. User will install software on their own server.
2. User will log into an admin panel and define settings (MySQL, keywords, notification email addresses).
3. User will create tag-based templates in this admin panel.
4. User will click a button to "harvest" the content.
5. The system will scrape these public-domain sites and store the results in the MySQL database.
6. Once harvested, the data will appear in the templates wherever the tags were specified.
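To make the harvest-and-render flow in steps 4–6 concrete, here is a minimal PHP sketch. It covers only the keyword-matching and tag-substitution logic; fetching pages (e.g. via cURL or `file_get_contents`) and storing results via PDO/MySQL are omitted. The tag syntax (`{{harvest:keyword}}`) and function names are illustrative assumptions, not part of the spec.

```php
<?php
// Sketch only: the {{harvest:keyword}} tag format is an assumed convention.

/**
 * Scan raw HTML for text chunks that mention any of the given keywords.
 * Returns an array of keyword => array of matching snippets.
 */
function extract_matches(string $html, array $keywords): array {
    $text = strip_tags($html);
    $results = [];
    foreach ($keywords as $kw) {
        $results[$kw] = [];
        // Split into rough sentences/lines, keep chunks containing the keyword.
        foreach (preg_split('/(?<=[.!?])\s+|\n+/', $text) as $chunk) {
            if (stripos($chunk, $kw) !== false) {
                $results[$kw][] = trim($chunk);
            }
        }
    }
    return $results;
}

/**
 * Replace {{harvest:keyword}} tags in a template with harvested snippets.
 * Unknown keywords render as an empty string.
 */
function render_template(string $template, array $harvested): string {
    return preg_replace_callback(
        '/\{\{harvest:([\w\- ]+)\}\}/',
        function ($m) use ($harvested) {
            $kw = $m[1];
            return isset($harvested[$kw]) ? implode(' ', $harvested[$kw]) : '';
        },
        $template
    );
}
```

In a full build, the harvested snippets would be written to MySQL during step 5 and read back when a template page is served in step 6; this sketch skips persistence to stay self-contained.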
I have a video of a similar system I can show serious programmers on request.
## Deliverables
1) Complete and fully-functional working program(s) in executable form as well as complete source code of all work done.
2) Deliverables must be in ready-to-run condition, as follows (depending on the nature of the deliverables):
a) For web sites or other server-side deliverables intended to exist in only one place in the Buyer's environment: deliverables must be installed by the Seller in ready-to-run condition in the Buyer's environment.
b) For all others, including desktop software or software the Buyer intends to distribute: a software installation package that will install the software in ready-to-run condition on the platform(s) specified in this bid request.
3) All deliverables will be considered "work made for hire" under U.S. Copyright law. Buyer will receive exclusive and complete copyrights to all work purchased. (No GPL, GNU, 3rd party components, etc. unless all copyright ramifications are explained AND AGREED TO by the buyer on the site per the coder's Seller Legal Agreement).
## Platform
Linux
PHP
MySQL