We have a working script that gathers public record data.
It is a WWW::Mechanize screen scraper that already has all the functionality we need.
The site it scrapes is very unreliable and often times out or returns errors. The scrape needs to be "hardened" so that, no matter what happens, it never skips any rows or days.
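A minimal sketch of the kind of retry wrapper we have in mind, assuming a $mech object and per-URL fetches; the real script's variables and structure may differ:

#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize;

# Retry a fetch until it succeeds, backing off between attempts,
# so a transient timeout or server error never loses a row or a day.
sub fetch_with_retry {
    my ($mech, $url, $max_tries) = @_;
    $max_tries ||= 10;
    for my $try (1 .. $max_tries) {
        my $ok = eval { $mech->get($url); $mech->success };
        return $mech->content if $ok;
        warn "attempt $try/$max_tries failed for $url: ",
             ($@ || $mech->status), "\n";
        sleep 30 * $try;    # wait a little longer after each failure
    }
    die "giving up on $url after $max_tries attempts\n";
}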
Most of this job is error checking and waiting for the script to run; solving the problem will not be technically difficult.
The command line is simply: /usr/bin/perl [login to view URL]
It will start on the current day and run backwards...
Right now the SQL goes to STDOUT at line 156 of the script; $data is a hash reference to the retrieved information.
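Putting those pieces together, the backwards day loop might look roughly like the sketch below; fetch_day() and make_insert_sql() are hypothetical stand-ins for the real script's per-day scrape and the SQL emit around line 156, and the two-month stop condition matches the testing window described below:

#!/usr/bin/perl
use strict;
use warnings;
use DateTime;

my $stop = DateTime->now->subtract( months => 2 );   # 2-month test window
my $day  = DateTime->now->truncate( to => 'day' );

while ( $day >= $stop ) {
    # fetch_day() stands in for the real per-day scrape (hardened with
    # retries as above); it returns the hash reference the brief calls $data.
    my $data = fetch_day( $day->ymd );

    # The real script prints the SQL to STDOUT around line 156;
    # make_insert_sql() is a hypothetical stand-in for that step.
    print make_insert_sql($data);

    # Only step backwards once this day's rows are fully emitted,
    # so an error can never silently skip a day.
    $day->subtract( days => 1 );
}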
When results are complete, send a dump of the results for review. For testing purposes, no more than 2 months' worth of data will need to be reviewed.
Please bid for completion in under 5 days.
Payment via PayPal upon completion, or ikobo/Moneybookers when PayPal is not available.