
Topic awaiting preservation: Advice needed on multiple queries/updates/inserts w/ MySQL and PHP

 
Pugzly
Paranoid (IV) Inmate

From: 127.0.0.1
Insane since: Apr 2000

posted 08-15-2005 22:39

Hoping some of you PHP/MySQL guys can help me.

I have a database with a bunch of tables. I have a script that gets some data from a provider and populates the related tables. I repopulate the tables once a day. Everything works fine.

I need to extend some functionality to track when records are new. This requires me to get the data daily, but only insert the NEW data (or update changed data), while retaining the existing (unchanged) data. I assign a time stamp to the records when they are initially inserted.

I'm trying to determine the best approach to retrieving the records from the provider and checking my database.

My issue is that I'm a little green on doing a crap load of stuff during MySQL queries. Currently, my thinking is that I need to cycle through the array of results from the provider, and:

- query my db for a record according to a common, unique field (not a problem, since this is in place).
-- if it matches, see if the fields I need to monitor match
--- if they do, do nothing and move to the next record
--- if they don't, update the record and timestamp
-- if it doesn't match, insert the new record and timestamp.

This means I'm making two mysql_query() calls for each record. Multiply that by the number of records I have (~900 now, but up to 30k later), and we have a LOT of queries going on.
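The decision tree above boils down to a three-way choice per record. A minimal sketch of that choice as a pure function (field and variable names are made up for illustration; the surrounding SELECT/INSERT/UPDATE queries are elided):

```php
<?php
// Decide what to do with one provider record.
// $existing - the row already in the db for this unique key, or null if none
// $incoming - the record just fetched from the provider
// $watched  - the fields being monitored for changes
function classifyRecord(?array $existing, array $incoming, array $watched): string
{
    if ($existing === null) {
        return 'insert';     // no row with this unique key yet: new record
    }
    foreach ($watched as $field) {
        // loose comparison, since MySQL result sets come back as strings
        if ($existing[$field] != $incoming[$field]) {
            return 'update'; // a monitored field changed: update row + timestamp
        }
    }
    return 'skip';           // unchanged: leave the row and its timestamp alone
}
```

Each call decides one record; it's the per-record lookup query feeding `$existing` that multiplies the total query count.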

I can deal with this if I need to, since it's only happening once a day. But, I thought I would ask and see if you guys can provide an example of a better method.

Note: Before the need to track everything, I would merely empty the tables each day and repopulate them. But, alas, my needs dictate that I can't do that any more.

Tyberius Prime
Paranoid (IV) Mad Scientist with Finglongers

From: Germany
Insane since: Sep 2001

posted 08-16-2005 01:20

how about getting a whole block of the unique field + assorted fields?

so you have $arrUniques = select (possibly matching fields) from table where uniqueField in ('value','another value',1,2,3...)
go through each result, check with your unique field, then check the matching fields, then do whatever you need to do.

Though with 30k records, you might be better off doing it one by one, lest you run into memory problems (of course that doesn't apply if your possibly-matching fields are small enough, and you have integer unique values...).
Alternatively, think up a good way to group this so you still need fewer select queries.
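A sketch of that batched lookup, assuming a made-up `items` table with `item_id` as the unique field and `price`/`status` as the monitored fields (use real escaping such as `mysqli_real_escape_string()` against a live connection):

```php
<?php
// Build one SELECT that fetches the unique field plus the monitored
// fields for a whole block of incoming records, instead of querying
// once per record. Table and column names here are illustrative only.
function buildBatchSelect(array $uniqueValues): string
{
    $quoted = array_map(
        fn($v) => "'" . addslashes((string) $v) . "'",
        $uniqueValues
    );
    return "SELECT item_id, price, status FROM items WHERE item_id IN ("
         . implode(',', $quoted) . ")";
}
```

Index the result set by the unique field (`$byKey[$row['item_id']] = $row;`) and the comparison loop then runs entirely in memory: one lookup query per block instead of one per record.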

Pugzly
Paranoid (IV) Inmate

From: 127.0.0.1
Insane since: Apr 2000

posted 08-16-2005 03:28

I can group them. In fact, I was thinking of doing that.

The rows aren't that big, really, and I did think of reading the whole thing in, comparing/replacing, then writing the whole thing back. But I opted against that due to memory concerns. Even small rows add up when you're dealing with that many records.

Since I do have groups anyways, and each is under 1000 records, I may do it that way.

I'm still open to other ideas, though.
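The group-wise plan above can be sketched with `array_chunk()`; the batch size of 1000 matches the group size mentioned, but is an assumption to tune:

```php
<?php
// Split the incoming unique keys into blocks so that each lookup
// SELECT ... WHERE key IN (...) stays small and its result set is
// easy to hold in memory for the comparison pass.
function makeBatches(array $uniqueValues, int $size = 1000): array
{
    return array_chunk($uniqueValues, $size);
}
```

Each batch then costs one lookup SELECT plus whatever INSERTs/UPDATEs the comparison calls for, so ~30k records mean ~30 lookup queries rather than ~30,000.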
