Preserved Topic: PHP-driven database Hard copy/archive question (Page 1 of 1)

Paranoid (IV) Inmate From: Minneapolis, MN, USA
posted 03-06-2001 23:04
Since I started working on an events database, I have started seeing more and more great uses for my PHP/MySQL database, including but not limited to automatically generated text link menus at the tops and bottoms of pages.
Maniac (V) Mad Scientist From: Rochester, New York, USA
posted 03-06-2001 23:12
I do not think that making a static page would be a good idea. No matter what, you would need to check the database, so it is just as fast to grab the data and place it in the document dynamically as it is to check the page and rewrite it if it has changed. The other problem with recreating the page is that you would have to lock the database while the check occurs, because two concurrent requests to a page that was not updated would result in double the work, and double the time.
Maniac (V) Mad Scientist From: Belgrade, Serbia
posted 03-06-2001 23:25
Writing your menus to static HTML files can save some CPU cycles, and here's my suggestion for how to do it... You probably have an admin-like script that you use to add/remove menu items, right? Well, when you use that script, besides adding the data to the db it should also create those static HTML files (which you're going to include in your pages) at the same time.
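A rough sketch of what that could look like in PHP 4-era code (untested; the db credentials, table/column names and output path are just placeholders for whatever you actually use):

    <?php
    // rebuild_menu.php -- call this from the admin script right after a
    // menu item is added or removed. All names below are made up.
    $link = mysql_connect("localhost", "dbuser", "dbpass")
        or die("Could not connect: " . mysql_error());
    mysql_select_db("events", $link);

    $result = mysql_query("SELECT title, url FROM menu_items ORDER BY sort_order", $link)
        or die("Query failed: " . mysql_error());

    // Build the menu markup once...
    $html = "";
    while ($row = mysql_fetch_array($result)) {
        $html .= "<a href=\"" . $row["url"] . "\">" . $row["title"] . "</a><br>\n";
    }
    mysql_free_result($result);
    mysql_close($link);

    // ...and write it out as a static file for the pages to include.
    $fp = fopen("/path/to/includes/menu.html", "w");
    fwrite($fp, $html);
    fclose($fp);
    ?>

Each page then just does include("/path/to/includes/menu.html") instead of querying the database on every hit.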
Paranoid (IV) Inmate From: Minneapolis, MN, USA
posted 03-06-2001 23:29
Hmmm, good point. So the most efficient way would be to generate text files with the menus at the time of modification, then just insert the text. This would require a lot of text files (since I want each page to have its own link disabled and the text bold), but I could put them in a sub-directory to keep them out of the way.
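A sketch of that per-page version, reusing the hypothetical menu_items table from the snippet above (it assumes the db connection is already open, and page_id is whatever key identifies a page):

    <?php
    // write_menus.php -- one static menu file per page; the current page's
    // item is bold text instead of a link. Names and paths are hypothetical.
    $result = mysql_query("SELECT page_id, title, url FROM menu_items ORDER BY sort_order")
        or die("Query failed: " . mysql_error());

    $items = array();
    while ($row = mysql_fetch_array($result)) {
        $items[] = $row;
    }

    // One pass per page: bold the matching item, link everything else.
    foreach ($items as $current) {
        $html = "";
        foreach ($items as $item) {
            if ($item["page_id"] == $current["page_id"]) {
                $html .= "<b>" . $item["title"] . "</b><br>\n";
            } else {
                $html .= "<a href=\"" . $item["url"] . "\">" . $item["title"] . "</a><br>\n";
            }
        }
        $fp = fopen("/path/to/includes/menu_" . $current["page_id"] . ".html", "w");
        fwrite($fp, $html);
        fclose($fp);
    }
    ?>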
Maniac (V) Mad Scientist From: Belgrade, Serbia
posted 03-06-2001 23:46
If your pages are only going to be updated every two weeks, then I see no reason to use PHP. Even if you create static HTML files, including them with PHP will take some CPU time, because everything else on that page (the HTML code) has to be echoed, so using SSI might be more efficient. There's also another option: you could write a perl script which is called from crontab and generates complete HTML pages with everything already included (something like a pre-processor).
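For the record, the SSI version of the include is just a directive in the .shtml page:

    <!--#include virtual="/includes/menu.html" -->

and the crontab entry for a nightly perl pre-processor run would be along these lines (the script path is a placeholder):

    # rebuild the static pages every night at 4am
    0 4 * * * /usr/bin/perl /home/you/bin/rebuild_pages.pl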
Paranoid (IV) Inmate From: Minneapolis, MN, USA
posted 03-08-2001 22:27
Wait, explain to me how including with PHP requires "everything else to be echoed" while SSI doesn't. Both methods require the entire file to be parsed before serving it, but I'm assuming you mean PHP has some overhead here because it is a full scripting language with more possibilities. I can't imagine it would be much slower, but it probably doesn't matter since our server has never had to handle more than 1000 hits a day.
Maniac (V) Mad Scientist From: Belgrade, Serbia
posted 03-08-2001 22:42
Maniac (V) Mad Scientist From: Rochester, New York, USA
posted 03-10-2001 16:48
So would it be better to use an <!--#exec directive to call a perl script to handle whatever, rather than to use PHP?
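(For reference, that directive would look something like this in an .shtml page; the script path is hypothetical:)

    <!--#exec cgi="/cgi-bin/menu.pl" -->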
Paranoid (IV) Inmate From: Minneapolis, MN, USA
posted 03-13-2001 18:48
Max is the expert. But here's my take. If it works and you are not having server slow-downs then this is fine. If you need more speed then making the switch to perl scripts will increase performance. If perl isn't fast enough then use java servlets. If java is too slow then use C scripts. If C is too slow then program it in assembly. |
Maniac (V) Mad Scientist From: Belgrade, Serbia
posted 03-13-2001 23:16
Paranoid (IV) Inmate From: Minneapolis, MN, USA
posted 03-16-2001 23:37
I'd be interested in knowing how it compares to Java servlets in terms of speed... I suppose it would depend on the application. |
Maniac (V) Mad Scientist From: Belgrade, Serbia
posted 03-16-2001 23:48
Take a look at http://perl.apache.org/ for more information. |