Preserved Topic: Nosey script. (Page 1 of 1)

 
synax
Maniac (V) Inmate

From: Cell 666
Insane since: Mar 2002

posted 10-07-2003 00:22

I've been toying with this idea for a while, but I haven't succeeded in getting the script to do what I want.

At school all computer science students have webspace under the format http://torch.cs.dal.ca/~username/
However, not all the students use their webspace. What I'm trying to do is write a script that looks at all the students' directories and then prints a link to their website if it exists. If no website exists, a 403 is returned.

I'm trying to do it in PHP and if it helps I have shell access and so does PHP (exec(), backticks and so on). Also, I can access the students' directories under the shell prompt and actually look in their directories to see if there is a public_html folder.

Does anyone have any thoughts as to how I can accomplish this? I can post the code I've written so far, but I doubt it would be of much assistance.



[This message has been edited by synax (edited 10-07-2003).]

bitdamaged
Maniac (V) Mad Scientist

From: 100101010011 <-- right about here
Insane since: Mar 2000

posted 10-07-2003 03:36

I'm a little confused exactly what you are trying to do.

Do you want a page with links to all of the students pages if they exist?
If that is the case, then hopefully (and probably) you have all the student accounts in a single directory, or at least a consistent directory naming convention. If so, it shouldn't be hard to write a script that loops through all the directories within the parent directory to see if they have contents.

ie. Normally in this type of config you will have something like this

/homes/[user name]/public_html/

And you should be able to loop through all the directories in the /homes directory to see if there is something in [username]/public_html/
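A minimal sketch of that loop (the /homes path is just the example layout above, and using glob() with a public_html check is one assumed way to do it):

```php
<?php
// Illustrative sketch only: assumes the /homes/<user>/public_html/ layout
// described above. Adjust $base to the real parent directory.
$base = "/homes";

// glob() returns every matching path, or false on failure, so guard first.
$sites = glob("$base/*/public_html", GLOB_ONLYDIR);
if ($sites === false) {
    $sites = array();
}

foreach ($sites as $site) {
    $user = basename(dirname($site));   // /homes/<user>/public_html -> <user>
    echo "<a href=\"http://torch.cs.dal.ca/~$user/\">$user</a><br />\n";
}
?>
```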






.:[ Never resist a perfect moment ]:.

Emperor
Maniac (V) Mad Scientist with Finglongers

From: Cell 53, East Wing
Insane since: Jul 2001

posted 10-07-2003 03:49

synax: I presume this is the script you've mentioned before (posted to me over ICQ). So what you want is a way to rattle through all possible directories, and if one doesn't return a 403 (or 404?) you want it to print a link to that directory? Just to clarify, as this isn't too clear:

quote:
What I'm trying to do is write a script that looks at all the students' directories and then prints a link to their website if it exists. If no website exists, a 403 is returned.



___________________
Emps

The Emperor dot org

Tyberius Prime
Paranoid (IV) Mad Scientist with Finglongers

From: Germany
Insane since: Sep 2001

posted 10-07-2003 09:30

well, if you're on the same server, and the security is this low,
look into PHP's opendir(); it should tell you all you need to know to find out what directories are within another directory.
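For reference, the opendir()/readdir() pattern looks roughly like this (the /users/cs path is just an example):

```php
<?php
// Rough sketch of the opendir()/readdir() pattern: list the
// subdirectories (usernames) of a parent directory, one per line.
// "/users/cs" is an example path.
$dir = "/users/cs";

if ($handle = @opendir($dir)) {
    while (($entry = readdir($handle)) !== false) {
        // Skip the "." and ".." entries every directory contains.
        if ($entry === "." || $entry === "..") {
            continue;
        }
        if (is_dir("$dir/$entry")) {
            echo $entry . "\n";
        }
    }
    closedir($handle);
}
?>
```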

Skaarjj
Maniac (V) Mad Scientist

From: :morF
Insane since: May 2000

posted 10-07-2003 10:26

Why not pull up a database result set of all the computer science usernames into an array and cycle through it checking for any files in the directory?

synax
Maniac (V) Inmate

From: Cell 666
Insane since: Mar 2002

posted 10-07-2003 13:19

bitdamaged: Yes, they're all under the same directory (/users/cs/).

Emps: Yah, same script, and yah, print all webpages that don't return a 403 or 404

TP & Skaarjj: What I'm doing is going through the /users/cs/ directory and writing all the names of the subdirectories (the usernames of the CS students) into a text file. Then I just read the names from the file to do the checks.
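That dump step might look something like this (an illustrative sketch; the /users/cs/ path and csusers.dat filename are as described above):

```php
<?php
// Sketch of the dump step: write each subdirectory name under
// /users/cs/ (the CS usernames) into csusers.dat, one per line.
$abs_path = "/users/cs/";
$out = fopen("csusers.dat", "w");

if ($handle = @opendir($abs_path)) {
    while (($entry = readdir($handle)) !== false) {
        if ($entry !== "." && $entry !== ".." && is_dir($abs_path . $entry)) {
            fwrite($out, $entry . "\n");
        }
    }
    closedir($handle);
}
fclose($out);
?>
```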

I'll post what code I have when I get back from my next class.



synax
Maniac (V) Inmate

From: Cell 666
Insane since: Mar 2002

posted 10-07-2003 15:21

The part that's commented out is the part that isn't working.

code:
<?php

$i = 0;
$path = "http://torch.cs.dal.ca/";
$abs_path = "/users/cs/";

$handle = fopen("csusers.dat", "r");
while (!feof($handle)) {
$users[$i++] = fgets($handle, 1024);
echo $users[$i];
}
fclose($handle);


/*while ($i < count($buffer)) {
if ($fp = fopen($path . "~" . $buffer[$i] . "/", "r")) {
echo "<a href=\"". $path . "~" . $buffer[$i] . "\">". $buffer[$i] . "<br />";
fclose($fp);
}
$i++;
}*/

?>





[This message has been edited by synax (edited 10-07-2003).]

bitdamaged
Maniac (V) Mad Scientist

From: 100101010011 <-- right about here
Insane since: Mar 2000

posted 10-07-2003 18:00

hmm..........

There's a couple of things wrong here.

First, in this part:
while (!feof($handle)) {
$users[$i++] = fgets($handle, 1024);
echo $users[$i];
}


Does that work? From what I can tell you are incrementing $i after adding a username, so you are basically doing

$users[3] = something;
echo $users[4];

which shouldn't display anything. (Note: most languages will let you do $users[++$i], which increments $i before returning it as the index.)

Try

$users[$i] = fgets($handle, 1024); // etc.
echo $users[$i];
$i++;
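A quick demo of the post- vs. pre-increment difference in PHP:

```php
<?php
// Post-increment: $i is used as the index first, then incremented.
$i = 0;
$a = array();
$a[$i++] = "x";    // stores at index 0; $i is now 1
// echo $a[$i];    // would read index 1 -- never set, so nothing prints

// Pre-increment: $i is incremented first, then used as the index.
$j = 0;
$b = array();
$b[++$j] = "y";    // $j becomes 1, then stores at index 1
echo $b[$j];       // reads index 1 -- prints "y"
?>
```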

The second part won't work for a lot of reasons.
First, the while condition will never be true: $buffer doesn't exist, and $i is now your number of users. Presumably $buffer is supposed to be $users, but you would still need to reset $i to zero.

Just to clean this up, get rid of all the $i stuff; you don't need it.

this:
while (!feof($handle)) {
$users[$i++] = fgets($handle, 1024);
echo $users[$i];
}

can be this:
while (!feof($handle)) {
$users[] = fgets($handle, 1024);
echo $users[count($users)-1];
}

Then, in the second part, use a foreach instead of a while:

foreach($users as $var) {
// ... and just echo $var instead of $buffer[$i]
}






.:[ Never resist a perfect moment ]:.

synax
Maniac (V) Inmate

From: Cell 666
Insane since: Mar 2002

posted 10-07-2003 19:16

Hmmm, apparently the code I posted above was an older version of the file. The latest version I had in the works (with Emps helping me over ICQ) was this one: http://torch.cs.dal.ca/~dion/act_dirs.phps

However, with bitdamaged's help, I've reduced the code to this:

code:
<?php

$i = 0;
$path = "http://torch.cs.dal.ca/";
$abs_path = "/users/cs/";

$handle = fopen("csusers.dat", "r");
while (!feof($handle))
$users[] = fgets($handle, 1024);
fclose($handle);


foreach ($users as $var) {
if ($fp = fopen($path . "~" . $var . "/", "r")) {
echo "<a href=\"". $path . "~" . $var . "\">". $var . "<br />";
fclose($fp);
}
}

?>



But as you can see, it still doesn't work. It doesn't like the "if I can open this path..." check: if ($fp = fopen($path . "~" . $var . "/", "r"))



[This message has been edited by synax (edited 10-07-2003).]

quisja
Paranoid (IV) Inmate

From: everywhere
Insane since: Jun 2002

posted 10-07-2003 20:42

You can't normally use PHP file functions over http, i.e. you can't do fopen("http://www.domain.com/file.dat"). You have to reference the file as it is on the server's file system for example fopen("../~user"). I think...

bitdamaged
Maniac (V) Mad Scientist

From: 100101010011 <-- right about here
Insane since: Mar 2000

posted 10-07-2003 21:14

Actually, you can use fopen over http (or ftp, or whatever scheme you want, actually).

This looks like it's throwing an error when it can't find the page. Sounds like an error-reporting setting, actually.

Try @fopen to turn off the warnings and see if it gives you a list of links
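Something along these lines (a sketch: it assumes allow_url_fopen is enabled, and the $users list here is a placeholder; note fgets() keeps the trailing newline on each username, so trim() it before building the URL):

```php
<?php
// Sketch of the @fopen check: @ suppresses the warning when the URL
// can't be opened, so a failed open just yields false.
// Assumes allow_url_fopen is on; $users is a placeholder list.
$path = "http://torch.cs.dal.ca/";
$users = array("exampleuser");

foreach ($users as $var) {
    $var = trim($var);   // fgets() keeps the newline -- strip it first
    if ($fp = @fopen($path . "~" . $var . "/", "r")) {
        echo "<a href=\"" . $path . "~" . $var . "\">" . $var . "</a><br />\n";
        fclose($fp);
    }
}
?>
```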



.:[ Never resist a perfect moment ]:.

butcher
Paranoid (IV) Inmate

From: New Jersey, USA
Insane since: Oct 2000

posted 10-07-2003 22:04

Synax

I'm not sure if this is exactly what you are looking for, but you may be able to incorporate parts of it to your benefit. This bit of code opens the URL and checks the return header. I only have code at the bottom to check for 404s, but it can easily be expanded to check for any type of response.

I hope it helps.

code:
<?php

$url = "www.yoururl.com";

/* Remove http:// if it was on the front of the url */

$has_http = strtolower(substr($url, 0, 7));
if ($has_http == 'http://') {
    $url = substr($url, 7);
}

/* Separate the host from the local page address */

$url_bits = explode("/", $url);
$host = $url_bits[0];
$local_page = "";
$the_rest = sizeof($url_bits);
for ($i = 1; $i < $the_rest; $i++) {
    $local_page .= "/$url_bits[$i]";
}
if ($local_page == "") {
    $local_page = "/"; // no path given, so request the site root
}

/* Open the socket and get the headers */

$response = "";
$fp = fsockopen($host, 80, $errno, $errstr, 30);
if (!$fp) {
    echo "$errstr ($errno)<br>\n";
} else {
    fputs($fp, "HEAD $local_page HTTP/1.0\r\nHost: $host\r\n\r\n");
    while (!feof($fp)) {
        $response .= fgets($fp, 128) . "<br />";
    }
    fclose($fp);
    if (strstr($response, "404")) {
        // it's a 404
    } else {
        // it's not a 404
    }
}
?>
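One caveat on that last check: strstr($response, "404") matches "404" anywhere in the headers (a Content-Length of 404, say). A tighter sketch pulls the code out of the status line only:

```php
<?php
// Tighter check: parse the status code out of the HTTP status line
// ("HTTP/1.0 404 Not Found") instead of searching the whole response.
// $response here is a hard-coded sample for illustration.
$response = "HTTP/1.0 404 Not Found\r\nContent-Type: text/html\r\n";

$status_line = strtok($response, "\r\n");       // first line only
$parts = explode(" ", $status_line);            // [protocol, code, reason...]
$code = isset($parts[1]) ? (int) $parts[1] : 0;

if ($code === 404) {
    echo "not found\n";       // prints this for the sample above
} elseif ($code === 403) {
    echo "forbidden\n";
} else {
    echo "status $code\n";
}
?>
```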



-Butcher-

krets
Paranoid (IV) Mad Scientist

From: KC, KS
Insane since: Nov 2002

posted 10-07-2003 22:09

Shouldn't the title of this page be:

"What web geeks do to meet girls"

:::11oh1:::

synax
Maniac (V) Inmate

From: Cell 666
Insane since: Mar 2002

posted 10-08-2003 00:37

krets: Obviously you haven't taken computer science...

Skaarjj
Maniac (V) Mad Scientist

From: :morF
Insane since: May 2000

posted 10-08-2003 00:44

Yeah! Comp. Sci. students aren't there to learn how to meet girls!


They're there to learn how NOT to meet girls!
