Howdy!
Okay - seems I've run into a minor issue here. http://www.macombsheriff.com/dev/get_psor.phps is the source. It grabs data from a state database and dumps it into mine, running via cron once a day.
Seems the state database (http://mipsor.state.mi.us) has been throwing 500 errors, at least since this morning. The way we're doing this is to pull all of the data fresh each day: when we go to toss it into the database, we empty the tables of the existing data, then insert the new data. You'll see near the top of the maxParseZIP($zip) function that I send an email saying it didn't update, then, after the 'exit' (where I 'assume' it's safe to continue), I empty the tables. This has worked okay when the site is down (completely inaccessible), but it doesn't catch 500 errors. Now it empties the tables and, since there is no new data, leaves them empty.
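To make the flow clearer, here's a stripped-down sketch of what the script effectively does today (not the real code; fetch_state_data(), parse_offenders(), and the 'offenders' table are made-up names standing in for what's in get_psor.phps):

```php
<?php
// Simplified sketch of the current flow -- placeholder names throughout.
function maxParseZIP($zip) {
    $page = fetch_state_data($zip);   // pull the state site's page for this ZIP

    if ($page === false) {
        // Only trips when the site is completely unreachable.
        mail('me@example.com', 'PSOR update failed', 'State site not responding.');
        exit;
    }

    // A 500 error page still comes back as "a page", so we keep going:
    mysql_query("DELETE FROM offenders WHERE zip = '$zip'");   // old data gone

    foreach (parse_offenders($page) as $row) {   // nothing parses out of an error page...
        // ...insert $row...
    }
    // ...so the table ends up empty.
}
?>
```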
Any recommendations on how I can verify a little better? I'd rather have the old data (yesterday's) than NO data at all. Up until now, it's worked great.
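The best I've come up with so far (again, just a rough sketch with the same placeholder names) is to hold off on the delete until the new rows are actually in hand, but I don't know if that's the right approach:

```php
<?php
// Rough idea: only clear a ZIP's rows once the new pull parsed into something usable.
$newRows = parse_offenders(fetch_state_data($zip));   // placeholders from the sketch above

if (is_array($newRows) && count($newRows) > 0) {
    mysql_query("DELETE FROM offenders WHERE zip = '$zip'");
    foreach ($newRows as $row) {
        // ...insert $row...
    }
} else {
    // Keep yesterday's data and just send the "didn't update" email instead.
    mail('me@example.com', 'PSOR update skipped', "No usable data for $zip; kept the old rows.");
}
?>
```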
And on an unrelated note, the $ZIPs array is hardcoded near the bottom. I'd rather pull it from an existing table in the db (I reuse that table elsewhere on the site, and it's the most complete list). What's the best way to SELECT that data and then use it in the script?
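Something like this is what I'm picturing (the 'zipcodes' table and 'zip' column are just placeholders, and it assumes the DB connection is already open like the rest of the script), but I don't know if it's the cleanest way:

```php
<?php
// Rough idea for replacing the hardcoded $ZIPs array -- placeholder table/column names.
$ZIPs = array();
$result = mysql_query("SELECT zip FROM zipcodes ORDER BY zip");
while ($row = mysql_fetch_assoc($result)) {
    $ZIPs[] = $row['zip'];
}

// Then loop over them exactly like the hardcoded array:
foreach ($ZIPs as $zip) {
    maxParseZIP($zip);
}
?>
```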
Thanks!