mobrul
Bipolar (III) Inmate | Insane since: Aug 2000
posted 06-06-2004 22:22
I have a table that matches virtual URLs to page IDs...we're moving from an "index.php?pid=xxx" system to a "/support/articles/etc/" system.
I have urls which look like this:
/support/articles/nameOfArticle/
/support/downloads/patchNumber/
/support/resources/nameOfResource/
etc.
There is a potentially unlimited number of 2nd-level groups (articles, downloads, resources, etc).
When someone goes to the /support/ page, I want them to see an index of all the different kinds of support (articles, downloads, resources, etc) they could access.
I'm trying to write an SQL SELECT statement that will select one result for each of the existing second-level items...and I'm at a total loss as to how to GROUP BY and LIMIT that.
Can anybody offer any help?
Emperor
Maniac (V) Mad Scientist with Finglongers | From: Cell 53, East Wing | Insane since: Jul 2001
posted 06-06-2004 23:17
mobrul: I see - I suspect I'd need more information, but what I assume you want is for the support page to have:
Articles
-nameOfArticle
Downloads
-patchNumber
It's doable, but I'd need to know the table setup and what criteria you are using for getting that one result from the third level. I have a bit of SQL I use that might be handy - I'll dig it out.
___________________
Emps
The Emperor dot org | Justice for Pat Richard | FAQs: Emperor | Site Reviews | Reception Room
Emperor
Maniac (V) Mad Scientist with Finglongers | From: Cell 53, East Wing | Insane since: Jul 2001
posted 06-06-2004 23:27
I use this rather complicated SQL:
code:
$sql = "SELECT urt.q,
urr.rank,
urtl.u_ranker_in_count
FROM u_ranker_pages AS urp,
u_ranker_terms_link AS urtl,
u_ranker_terms AS urt,
u_ranker_rank AS urr
LEFT JOIN u_ranker_rank AS urr2
ON urr.link_id = urr2.link_id AND
urr.date_updated < urr2.date_updated
WHERE urr2.link_id IS NULL AND
urp.page = '$ranker_id' AND
urp.page_id = urtl.page_id AND
urtl.terms_id = urt.terms_id AND
urr.link_id = urtl.link_id
GROUP BY urr.link_id
ORDER BY urr.rank, urtl.u_ranker_in_count DESC, urt.q";
to get the most recent result for a specific search term on pages like this:
http://www.the-emperor.org/con_cat/tools/u_ranker.php?ranker_id=/
The crucial bit is the LEFT JOIN on a copy of itself and the:
code:
urr.date_updated < urr2.date_updated
WHERE urr2.link_id IS NULL
This basically makes it compare the two copies until it hits a row on one side which is NULL: the only row with no newer counterpart is the most recent one, so that is the row returned.
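Stripped right down (the table and column names here are just placeholders, not my real schema), the pattern is:
code:
SELECT r.link_id, r.rank
FROM rank_table AS r
LEFT JOIN rank_table AS newer
       ON newer.link_id = r.link_id
      AND newer.date_updated > r.date_updated
WHERE newer.link_id IS NULL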
If that seems to be what you are after, then give me more details and we can work towards concocting something that should suit you.
mobrul
Bipolar (III) Inmate | Insane since: Aug 2000
posted 06-06-2004 23:30
[edit: sorry, posting at the same time. Give me a moment to see what this is.
Thanks.]
- rest deleted cuz it doesn't make sense in light of Emp's post there. -
mobrul
Bipolar (III) Inmate | Insane since: Aug 2000
posted 06-06-2004 23:52
OK, this is close.
What I want is SQL that does something like this:
code:
$allTypes = array();
$sql = "SELECT url FROM pageid_url_index WHERE url LIKE '/support/%' GROUP BY url";
$qry = pg_query($db, $sql);
while($row = pg_fetch_array($qry))
{
    $url = $row['url'];
    // '/support/articles/nameOfArticle/' -> 'articles'
    $thisType = explode('/', $url);
    $thisType = $thisType[2];
    // only keep each 2nd-level group once
    if(!in_array($thisType, $allTypes))
    {
        array_push($allTypes, $thisType);
    }
}
Now, at the end of this while() loop, I'd have an array $allTypes that contains the names of all the types of support one could get.
I want a SELECT statement that will do all of that in the query itself.
When the dataset is small, the PHP method is no big deal...but my data set is not small. =)
Emperor
Maniac (V) Mad Scientist with Finglongers | From: Cell 53, East Wing | Insane since: Jul 2001
posted 06-06-2004 23:58
mobrul: With that method you are pulling out all the results and making PHP do the work, which will be a lot slower than letting the database crunch the data down for you.
mobrul
Bipolar (III) Inmate | Insane since: Aug 2000
posted 06-07-2004 00:03
I know.
That's why I said "When the dataset is small, there is no big deal w/ the php method...but my data set is not small. =)"
What I need is some SQL that will accomplish the same task the PHP does...but faster and more resource-efficient.
Does that make sense?
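Something like this is what I'm groping for (completely untested; split_part() is the PostgreSQL string function I have in mind for pulling the segment out of the url, and pageid_url_index is my url-to-pid table from above):
code:
-- one row per 2nd-level group, plus one example url from each group
SELECT split_part(url, '/', 3) AS support_type,
       MIN(url) AS sample_url
FROM pageid_url_index
WHERE url LIKE '/support/%'
GROUP BY split_part(url, '/', 3)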
Emperor
Maniac (V) Mad Scientist with Finglongers | From: Cell 53, East Wing | Insane since: Jul 2001
posted 06-07-2004 03:05
mobrul: OK, I understand - that's what my SQL is designed to do. So, as I said, if you want to do something like this, provide more details on structure, etc. and we'll see what we can put together.
mobrul
Bipolar (III) Inmate | Insane since: Aug 2000
posted 06-07-2004 03:19
That might be half the problem. There are two tables.
One (older) table looks like this:
PAGES
--------
pid
title
description
content
and the other (newer) table looks like this:
PAGE_URL_INDEX
-----------------------
url
pid
That's it. Nothing more.
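Resolving a virtual url to its content is then just a join across the two, something like:
code:
SELECT p.title, p.content
FROM page_url_index AS i, pages AS p
WHERE i.url = '/support/articles/nameOfArticle/'
AND p.pid = i.pid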
Unfortunately I've been handed a mess.
The discussion groups, the marketing pages, the support pages, the class registrations...EVERYTHING...is a record in the db with no way to distinguish one from another...except knowledge and hardcoded links to "index.php?pid=295" buried in each page.
With a small army, I've managed to assign a url to each pid. That's all I have.
Emperor
Maniac (V) Mad Scientist with Finglongers | From: Cell 53, East Wing | Insane since: Jul 2001
posted 06-07-2004 04:03
mobrul: Ahhhh, I see - that's not good. What criteria were you going to use to select which single 3rd-level item went with the second level on the support page?
If you are continuing to add material, you could stick in a timestamp or date field and then use that to sort by, so that the most recent would show.
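Something as small as this would do it (assuming PostgreSQL, and that the column goes on the url table - the names are just for illustration):
code:
ALTER TABLE page_url_index ADD COLUMN date_added timestamp;
ALTER TABLE page_url_index ALTER COLUMN date_added SET DEFAULT now();
The newest item per group then falls out of an ORDER BY date_added DESC.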
mobrul
Bipolar (III) Inmate | Insane since: Aug 2000
posted 06-07-2004 04:38
I wasn't going to use any criteria. I was hoping to use that list (of 2nd level items) as a layer of (dynamic) navigation. Then, choosing the 'articles' link, for instance, would produce a list of articles. (That's the easy part...)
I was also going to (eventually) allow a user to not only select a type of support, but a product. At that point, one would get an index of all the support available for that product, regardless of the type. (That obviously requires a further level of referencing not yet in existence...too much work, not enough time...story of my life...)
You see my dilemma?
Emperor
Maniac (V) Mad Scientist with Finglongers | From: Cell 53, East Wing | Insane since: Jul 2001
posted 06-07-2004 04:52
mobrul: I suspect I have misunderstood the problem - sorry.
[edit: Blame an excess of Chinese food if you like]
If I were you I'd try and restructure the data, as otherwise you are going to be causing yourself heartache for no good reason. I'd run a script that works through the entries (if you have a lot of results you'll probably need to step through them 50 or so at a time), gets the results out, splits them, and then puts the pieces back in two different fields. A smallish amount of work fixing your data will really make things faster and more efficient.
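Roughly the sort of script I mean (a PHP sketch using the pg_* calls you already have; it assumes you've first added, say, section and item columns to the url table - those names are just for illustration):
code:
$offset = 0;
do {
    // step through the table 50 rows at a time so nothing times out
    $qry = pg_query($db, "SELECT pid, url FROM page_url_index
                          ORDER BY pid LIMIT 50 OFFSET $offset");
    $count = pg_num_rows($qry);
    while($row = pg_fetch_array($qry))
    {
        // '/support/articles/nameOfArticle/' -> 'support', 'articles', 'nameOfArticle'
        $parts   = explode('/', trim($row['url'], '/'));
        $section = isset($parts[1]) ? pg_escape_string($parts[1]) : '';
        $item    = isset($parts[2]) ? pg_escape_string($parts[2]) : '';
        pg_query($db, "UPDATE page_url_index
                       SET section = '$section', item = '$item'
                       WHERE pid = " . (int)$row['pid']);
    }
    $offset += 50;
} while($count == 50);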
mobrul
Bipolar (III) Inmate | Insane since: Aug 2000
posted 06-07-2004 05:05
damn Chinese food... =)
That's what I was afraid of.
For the record, if one figures in discussion groups (not widely used, thank the chicken gods or the poor marketing skills of my predecessor) I have well over 2k records/pages in that db.
Thanks anyway.
Emperor
Maniac (V) Mad Scientist with Finglongers | From: Cell 53, East Wing | Insane since: Jul 2001
posted 06-07-2004 13:01
mobrul: Yep, with 2k records it will need to be stepped through or it'll time out. It strikes me as being the best solution, and you may find that you need to do it anyway in the future, so I'd do it now - it's fairly straightforward (I have some bits and bobs around here that could help) but it's your call.
mobrul
Bipolar (III) Inmate | Insane since: Aug 2000
posted 06-07-2004 15:00
This is such a cf that I'm tempted to rebuild the entire data structure.
When I was assigning URLs to pids, I built a little command-line tool that accepted a single argument (the pid at which the user would like to start).
It queried the db and:
- printed the title and pid
- opened the page in a browser
- prompted for a URL at STDIN
It checked for duplicates, but did no fancier error checking than that.
Also, upon input, it checked the db connection and re-connected if it timed out.
(For some reason I never bothered to investigate, if one let it sit for 30 minutes or so it would time out and not reconnect. As long as one entered a URL within the 20-30 minute window, one could work literally all day long.)
That seemed to work really well. I'll probably build upon that.
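The connection check amounts to roughly this (a from-memory sketch, not the exact code):
code:
function ensure_connection(&$db, $conn_string)
{
    // pg_ping() tries to repair a broken link; if it can't, connect from scratch
    if(!is_resource($db) || !pg_ping($db))
    {
        $db = pg_connect($conn_string);
    }
    return $db;
}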
As a phase 3 part of this rebuild, I was thinking of building a web maintenance GUI application with PHP and GTK. Maybe that'll get moved up to phase 1.
You can be sure I'll post back here when...er...if I run into any more difficulties.
Thanks