Topic awaiting preservation: searching using JavaScript?
Nervous Wreck (II) Inmate From: sj, ca, usa
posted 03-13-2005 18:08
greetings inmates,
Bipolar (III) Inmate From: London
posted 03-13-2005 18:36
Do you mean client-side JavaScript, or server-side JavaScript?
Paranoid (IV) Mad Scientist with Finglongers From: Germany
posted 03-13-2005 19:24
anyhow, having the client download your pages before searching through them is going to be so slow, it's not even going to be funny.
Nervous Wreck (II) Inmate From: sj, ca, usa
posted 03-14-2005 06:15
I could always use another language, but in the interest of this project there was a need for it to be very simple, very easily manageable, and usable without installing any applications or web servers to get it going. That, and I don't have the technical know-how to do it in Perl/CGI.
Paranoid (IV) Inmate From: France
posted 03-14-2005 14:42
Nervous Wreck (II) Inmate From:
posted 03-14-2005 16:36
I recall that I made a script (a long time ago) that scanned through the .htm files and built a tree, which was saved as a PHP data structure (for easy access).
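A minimal sketch of that indexing idea, here in JavaScript (Node.js) rather than the original PHP; the pages directory and index.json output name are invented for illustration:

code:

    // Scan .htm files and build a keyword -> filenames index, then save it
    // for easy access later, much as the PHP version described above did.
    // The 'pages' directory and 'index.json' name are made up.
    const fs = require('fs');
    const path = require('path');

    const index = {};
    for (const file of fs.readdirSync('pages')) {
      if (path.extname(file) !== '.htm') continue;
      const text = fs.readFileSync(path.join('pages', file), 'utf8')
        .replace(/<[^>]*>/g, ' ')   // crude tag stripping
        .toLowerCase();
      // Dedup words per file so each filename appears once per keyword.
      for (const word of new Set(text.match(/[a-z]{3,}/g) || [])) {
        (index[word] = index[word] || []).push(file);
      }
    }
    fs.writeFileSync('index.json', JSON.stringify(index));

The search page can then load the saved index and look keywords up directly instead of scanning the pages.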
Paranoid (IV) Inmate From: Maryland, USA
posted 03-15-2005 03:02
You could link them to Google for your site.
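One common way to do that is a small form that hands the query to Google with the site: operator. A sketch, with example.com standing in for the real domain:

code:

    <form onsubmit="searchGoogle(this); return false;">
      <input type="text" name="q">
      <input type="submit" value="Search">
    </form>
    <script type="text/javascript">
    // Send the query to Google, restricted to this site via the
    // site: operator. Replace example.com with your own domain.
    function searchGoogle(form) {
      location.href = 'http://www.google.com/search?q='
        + encodeURIComponent('site:example.com ' + form.q.value);
    }
    </script>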
Paranoid (IV) Inmate From: USA
posted 03-15-2005 21:18
Client side is A) SLOW and B) redundant. It means each user has to download all the data of your site every time they search, on top of the inherent slowness of JS, which costs a LOT of processor cycles, as opposed to having the server do it once and hand just the results to the users.
Paranoid (IV) Inmate From: France
posted 03-15-2005 22:02
It's not really that JS is slow: if it really were, we couldn't do a lot of the stuff we do for the 20 Lines JavaScript Contests. But it is redundant. Why would you send all the pages of your site to the client when they are stored on your server, where a server-side script can search them fairly easily and quickly?
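For comparison, a server-side search keeps the pages on the server and sends back only the matches. A rough sketch in JavaScript (Node.js-style, which postdates this thread; any server-side language would do, and the pages directory name is invented):

code:

    // Search every .htm page on the server for a term and return only the
    // matching filenames; the pages themselves never travel to the client.
    const fs = require('fs');
    const path = require('path');

    function search(term) {
      const needle = term.toLowerCase();
      return fs.readdirSync('pages')
        .filter(f => path.extname(f) === '.htm')
        .filter(f => fs.readFileSync(path.join('pages', f), 'utf8')
                       .toLowerCase().includes(needle));
    }

    console.log(search('javascript')); // -> filenames containing the term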
Nervous Wreck (II) Inmate From:
posted 03-16-2005 13:13
hmmm... It might be slow, but I'm pretty sure it will work for more than 10 pages. To prove it I guess I'd have to implement it, but my supervisor has instructed me not to do ANY recreational programming before my master's thesis is finished.
Paranoid (IV) Inmate From: France
posted 03-16-2005 14:29
Nervous Wreck (II) Inmate From: sj, ca, usa
posted 03-17-2005 00:56
The client-side solution I am proposing basically just has the client build up an array of filenames and keywords. The search then only needs to compare the query against the array of keywords and return the filenames associated with the matches, so there is no need to crawl through the pages themselves. So far, it's pretty fast. I don't believe this engine will slow down dramatically when comparing against, say, an array of 100 elements.
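A sketch of roughly what that approach could look like; the filenames and keywords in the array are invented for illustration:

code:

    // Prebuilt array of keywords per file; the search compares the query
    // against keywords only, never the page contents themselves.
    var pages = [
      { file: 'intro.htm',   keywords: ['welcome', 'home', 'about'] },
      { file: 'contact.htm', keywords: ['email', 'phone', 'address'] }
    ];

    function search(term) {
      term = term.toLowerCase();
      var hits = [];
      for (var i = 0; i < pages.length; i++) {
        for (var j = 0; j < pages[i].keywords.length; j++) {
          if (pages[i].keywords[j].indexOf(term) === 0) { // prefix match
            hits.push(pages[i].file);
            break; // one hit per file is enough
          }
        }
      }
      return hits; // filenames whose keywords match the query
    }

A linear scan like this over an array of 100 entries is trivial work for any browser, so the 100-element case should indeed stay fast.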
Paranoid (IV) Inmate From: USA
posted 03-17-2005 07:44
Maniac (V) Mad Scientist From: 100101010011 <-- right about here
posted 03-18-2005 19:31