Topic: How do you monitor websites?

 
Author Thread
Tyberius Prime
Maniac (V) Mad Scientist with Finglongers

From: Germany
Insane since: Sep 2001

posted 03-23-2006 08:19

These days, the company I work for has a couple dozen customers,
and it's getting impossible to regularly check all their websites for 'still up, host didn't screw up', etc.

So I'm wondering, which of you uses any kind of software, or hosted service, to monitor your (or your customers')
websites for downtime and, possibly, changes.

What I'd be looking for feature-wise is of course minimal setup cost, as well as the ability to check a special page that
would use whatever server-side language to check the database.
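Such a "special page" can stay very small. Here is a minimal sketch in Python of the idea; the probe functions are hypothetical stand-ins (a real one would, say, run a trivial SELECT against the database), and the function names are my own, not from any particular framework:

```python
# Sketch of a health-check page: run named probes (e.g. "can I query
# the database?") and report one status line that an external monitor
# can match with a regexp. The probes below are stand-ins.

def run_health_checks(probes):
    """Run each named probe; return ('OK', []) or ('FAIL', [failed names])."""
    failed = []
    for name, probe in probes.items():
        try:
            if not probe():
                failed.append(name)
        except Exception:
            # A probe that blows up counts as a failure, not a crash.
            failed.append(name)
    return ("OK" if not failed else "FAIL", failed)

if __name__ == "__main__":
    probes = {
        "database": lambda: True,  # stand-in: real check would hit the DB
        "disk": lambda: True,
    }
    status, failed = run_health_checks(probes)
    print(status if not failed else f"{status}: {', '.join(failed)}")
```

The point of printing a single, predictable line is that the monitoring side only needs curl plus a string match, no parsing logic.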

Any ideas?
So long,

->Tyberius Prime

PS: I'm trying to find such things via Google, but there are apparently a whole bunch of them...
PPS: Web based interface and installation on a server are of course requirements for any solution we'd have to take care of ourselves.

(Edited by Tyberius Prime on 03-23-2006 08:19)

(Edited by Tyberius Prime on 03-23-2006 08:25)

JKMabry
Maniac (V) Inmate

From: raht cheah
Insane since: Aug 2000

posted 03-23-2006 09:18

I'm not clear whether you're talking about servers where you have root admin, shared hosts in different locations, or something else, but have a look at Nagios in the meantime

DmS
Maniac (V) Inmate

From: Sthlm, Sweden
Insane since: Oct 2000

posted 03-23-2006 13:54

We have a lot of this running,
from database and processor status/load checks (which require root-level installs) down to simpler scheduled curl requests against a web API to retrieve answers/data to check. To monitor all of these we use Nagios, which lists the watchpoints and alerts at different levels when there is a problem.

Basic checks can also be done with a scheduled curl to the site plus some simple regexp, parsing for something you know should be near the beginning or end of the source of the index page.

Even if that's very basic, it should at least tell you whether the page is rendering at all.
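The curl-plus-regexp check above can be sketched in a few lines. This is a hedged example, not a production monitor: in practice you would feed `page_looks_ok` the output of a scheduled `curl -s` run, and the markers and sample HTML here are placeholders:

```python
import re

def page_looks_ok(html, start_pattern, end_pattern):
    """Return True if the fetched source contains the markers we expect
    near the beginning and near the end of the index page."""
    return (re.search(start_pattern, html[:512]) is not None
            and re.search(end_pattern, html[-512:]) is not None)

# Placeholder for what `curl -s http://example.com/` might return:
sample = "<html><head><title>Acme</title></head><body>...</body></html>"
print(page_looks_ok(sample, r"<title>Acme</title>", r"</html>"))  # prints True
```

Checking both ends of the source catches the common failure where a server-side error truncates the page halfway through rendering.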

For more intricate functionality you could set up a small API for the site that you can curl to; it can be as simple as checking a count in the database and returning 1 or 0 depending on whether it has incremented as expected.
You get the idea.

/D

{cell 260} {Blog}
-{"Theories without facts are just religions...?}-

(Edited by DmS on 03-23-2006 13:57)

GUisle
Obsessive-Compulsive (I) Inmate

From: Guam Isle
Insane since: Aug 2011

posted 08-02-2011 04:00

I think there is software for those things. There's a daemon for logging system usage, and each daemon has its own built-in logger. Apache should have a log in /var, I think.

Once the days are old, you'll never go back.

chemlight
Neurotic (0) Inmate
Newly admitted

From:
Insane since: Aug 2011

posted 08-04-2011 19:50

I've used Nagios and Ganglia. Nagios is a real-time monitoring system. You can tie it into a text-messaging or call system and get notified if something goes down. Ganglia is used for historical graphing, and is great to review after something does go down.

They do take a little while to set up initially, but are well worth it. With Nagios, one of the things you'll want to look into is NagVis: it will lay out all of the information in a clean, easy-to-understand way, so you will know at a glance what is up and what is not.


