
Topic awaiting preservation: loading pages (Page 1 of 1)

 
Blacknight
Bipolar (III) Inmate

From: INFRONT OF MY PC
Insane since: Dec 2001

posted 04-09-2005 22:32

Sorry for the vague topic title, I didn't know what to call it.

I was wondering if this is possible:

On my site I read the menu out dynamically from my DB.
I am not using frames, so on each page I would have to read the menu out anew. Is there a way to read it out once and tell the other pages to use this menu?

Not sure if you get what I mean, if not just ask ^^

Thanks

Sorry, wrong forum, can someone send it over to Server-Side Scripting? Thanks.

(Edited by Blacknight on 04-09-2005 22:32)

Skaarjj
Maniac (V) Mad Scientist

From: :morF
Insane since: May 2000

posted 04-09-2005 23:10

You're talking about caching the menu contents. I think that this process can be better explained by the other inmates. For now I shall move this to the correct forum.


Justice 4 Pat Richard

H][RO
Bipolar (III) Inmate

From: Australia
Insane since: Oct 2002

posted 04-12-2005 02:55

Hmm, even if you did do this, wouldn't it essentially be doing the same thing as reading the menu out for the next page? It still has to draw it again even if it somehow came from cache, so you might as well continue as you were doing.

Not sure if I'm getting what you're asking right?

What's the reason you want to do this? What's your overall aim?

Blacknight
Bipolar (III) Inmate

From: INFRONT OF MY PC
Insane since: Dec 2001

posted 04-12-2005 09:19

Yes, but I would save on DB transactions.

DocOzone
Maniac (V) Lord Mad Scientist
Sovereign of all the lands Ozone and just beyond that little green line over there...

From: San Diego, California
Insane since: Mar 1994

posted 04-12-2005 09:47

I'd say that unless you're dealing with *lots* of hits per second or a *very* slow database server, you'd be better off just reading the menu from the DB each time. One of the more elegant solutions I've worked on involved an e-commerce site that really pounded the server, and the database specifically. What we did was make a back-end tool that would write the menu from the database once and save it on the server as a static page. Every time the menu was updated, it would write a new file which would then be included using standard server-side technology. Since the menu was hardly ever changed, maybe once or twice a day tops, this eliminated a lot of DB chatter. Are you using PHP for your database calls? I could steer you in the right direction if you want to give this a try.
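A rough sketch of what that could look like, assuming PHP and MySQL (the table, columns, file path, and connection details are all placeholders made up for the example):

code:
<?php
// rebuild_menu.php -- run by the admin tool whenever the menu is edited.
// Table, column, file and connection details are placeholders for this sketch.
function rebuild_menu(mysqli $db)
{
    $html = "<ul class=\"menu\">\n";
    $res  = $db->query('SELECT label, url FROM menu_items ORDER BY position');
    while ($row = $res->fetch_assoc()) {
        $html .= '  <li><a href="' . htmlspecialchars($row['url']) . '">'
               . htmlspecialchars($row['label']) . "</a></li>\n";
    }
    $html .= "</ul>\n";

    // Write the rendered snippet once; normal pages never hit the DB for it.
    file_put_contents('includes/menu.html', $html);
}

rebuild_menu(new mysqli('localhost', 'user', 'pass', 'site'));

Every other page then just does a plain include of includes/menu.html instead of running a menu query.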

I'm pretty sure you could also make the menu data into a global variable that only gets processed once, on the first load. Then you could generate the menu from this global variable (an array, I'd assume) when the other pages load. Again, I'll mention that this is a lot of work unless you're already having problems with page loading times due to database lag.
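One wrinkle: in ASP.NET an Application or Cache entry persists between requests, but in PHP a plain global doesn't, so the nearest equivalent there would be stashing the array in the session (or a shared cache). A rough PHP sketch, with the query, session key, and credentials all made up:

code:
<?php
session_start();

// Load the menu rows from the DB only once per visitor; afterwards reuse the
// array stored in the session. Query, key, and credentials are made up.
if (!isset($_SESSION['menu'])) {
    $db   = new mysqli('localhost', 'user', 'pass', 'site');
    $res  = $db->query('SELECT label, url FROM menu_items ORDER BY position');
    $menu = array();
    while ($row = $res->fetch_assoc()) {
        $menu[] = $row;
    }
    $_SESSION['menu'] = $menu;
}

// Render the menu from the cached array on every page.
echo "<ul class=\"menu\">\n";
foreach ($_SESSION['menu'] as $item) {
    echo '  <li><a href="' . htmlspecialchars($item['url']) . '">'
       . htmlspecialchars($item['label']) . "</a></li>\n";
}
echo "</ul>\n";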

Your pal, -doc-

Blacknight
Bipolar (III) Inmate

From: INFRONT OF MY PC
Insane since: Dec 2001

posted 04-12-2005 11:34

I like your idea of creating a file each time the menu gets edited.
I use ASP.NET (VB), I'll give it a try.

thx

hyperbole
Paranoid (IV) Inmate

From: Madison, Indiana, USA
Insane since: Aug 2000

posted 04-12-2005 18:09

Hi Doc!




-- not necessarily stoned... just beautiful.

Pugzly
Paranoid (IV) Inmate

From: 127.0.0.1
Insane since: Apr 2000

posted 04-12-2005 19:33

As Doc mentioned, I'd keep it in the db. If you're concerned about db activity, then you need to look at that as a whole, since it will impact other parts of the site as well. If there is a db performance issue, moving the menu away from the db will relieve you of some db chatter, but doesn't solve the overall issue - db latency & performance.

Perhaps the underlying code you're using to grab & manipulate the data from the db just needs to be optimized.

DmS
Maniac (V) Inmate

From: Sthlm, Sweden
Insane since: Oct 2000

posted 04-12-2005 22:40

You'd be surprised how much a database with correct indexes and SQL queries that use them can handle...

The community I've been involved in building now has some 370,000 members and a constant load of several thousand users logged in and in some way active at the same time. The community, http://www.pokerroom.com/pokah/, has been load-tested at 50+ pageviews per second, where the test cases simulate normal user actions generating around 100 DB selects/sec and 15-20 updates/sec... the database just purrs.

With proper indexing and optimized queries we managed to double the load four times over, compared to where we started, before the single web server gave up.

Now with loads like this it's very important to also use caching; examples like the one Doc gave are a very good alternative. A web server can serve a LOT more static pages than dynamic ones.

This said, there are some very good ways to kill the performance of a database.
Examples:
- Bad or nonexistent indexes, or rotten queries that force the DB to scan the whole table.
> Use "EXPLAIN SELECT ...." to see how the query actually works, then adjust (see the sketch after this list).

- A query for every little thing on the page.
> Most of the time you can join tables to get more info in one query. Done smartly, this means that you can cache things that change seldom and reuse them from page to page.

- Bad code... queries in a loop... That one's a killer!
> Avoid at all costs. If you absolutely cannot avoid it, generate a file from it that ONLY gets updated when the data changes. If that cannot be done, back off and rethink!
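A rough illustration of the first two points, assuming PHP and MySQL; every table, column, and credential here is invented for the example:

code:
<?php
// Illustration of the first two tips; names are made up for the sketch.
$db = new mysqli('localhost', 'user', 'pass', 'site');

// 1. Ask the database how it intends to run a query. A "type" of ALL in the
//    output means a full table scan, which usually means a missing index.
$res = $db->query("EXPLAIN SELECT label, url FROM menu_items WHERE parent_id = 3");
while ($row = $res->fetch_assoc()) {
    print_r($row);
}

// 2. One joined query instead of one query per item: every menu item together
//    with its section name, fetched in a single round trip.
$res = $db->query(
    "SELECT s.name AS section, m.label, m.url
       FROM menu_items m
       JOIN sections s ON s.id = m.section_id
      ORDER BY s.position, m.position"
);
while ($row = $res->fetch_assoc()) {
    print_r($row);
}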

This is in no way the whole truth, just some decent starting tips.
It should indicate that if you put some thought in up front, you can probably get a normal site to perform very well and still be nice to the database.

Good luck and have fun
/Dan

{cell 260} {Blog}
-{ "The Internet treats censorship as a malfunction and routes around it." (John Perry Barlow) }-

H][RO
Bipolar (III) Inmate

From: Australia
Insane since: Oct 2002

posted 04-13-2005 00:29

Don't be fooled into thinking a text file is necessarily faster than getting the data from the database; in fact, a lot of the time it is not. Add to that having to parse the data in the text file etc., and it really is simpler and easier to work with a DB.

DmS
Maniac (V) Inmate

From: Sthlm, Sweden
Insane since: Oct 2000

posted 04-13-2005 10:07

Tried to edit, but post was too old...
This bit in my post:

quote:
generate a file from it that ONLY gets updated when the data changes.



should be replaced with:

quote:
generate a file with the rendered html snippet to include from it that ONLY gets updated when the data changes.


As H][RO said, reading & parsing a file is not necessarily faster than a DB call. That's why you should render the complete section first and then include it; that will save you a lot more DB and processor time than parsing a text file.
/Dan

{cell 260} {Blog}
-{ "The Internet treats censorship as a malfunction and routes around it." (John Perry Barlow) }-

poi
Paranoid (IV) Inmate

From: France
Insane since: Jun 2002

posted 04-13-2005 10:20

Is it necessary to remind everyone of the "funky caching" technique and the "baked & fried" analogy presented by Rasmus Lerdorf, PHP's creator, in his PHP tips and tricks presentation?
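From memory, the idea is to point the web server's 404 handler at a script that builds the missing page from the database, writes it to disk (the "baked" copy), and serves it; every later request then hits a plain static file, instead of "frying" the page on each hit. A rough sketch, with the paths, credentials, and table name all invented:

code:
<?php
// generate.php -- Apache is assumed to be configured with:
//   ErrorDocument 404 /generate.php
// so a request for a page that has not been baked yet lands here.

$uri  = $_SERVER['REQUEST_URI'];          // e.g. /pages/about.html
$file = $_SERVER['DOCUMENT_ROOT'] . $uri; // where the baked copy will be written

// "Fry" the page once from the database.
$db  = new mysqli('localhost', 'user', 'pass', 'site');
$res = $db->query("SELECT body FROM pages WHERE uri = '" .
                  $db->real_escape_string($uri) . "'");
$row = $res ? $res->fetch_assoc() : null;

if ($row === null) {
    header('HTTP/1.1 404 Not Found');
    exit('Page not found');
}

$html = "<html><body>" . $row['body'] . "</body></html>";

// "Bake" it to disk so the next request is served as a static file.
file_put_contents($file, $html);

// ...and serve it this time as well.
header('HTTP/1.1 200 OK');
echo $html;

When a page changes in the DB, you just delete its baked file and it gets regenerated on the next hit.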

Blacknight
Bipolar (III) Inmate

From: INFRONT OF MY PC
Insane since: Dec 2001

posted 04-14-2005 15:04

I don't actually "need" to cut down on database traffic or anything, I was just trying to get around having to load data that is the same on each page again and again. It just looks to me like a waste of resources to have to do it, so I started to wonder how to get past it.

There are some good ideas around here, so I'll try some and find out which will work best for me. A global variable (array) sounds good, and a separately generated HTML file sounds good too.

So thanks for the ideas.
