this is probably a noob question

  • nomb

    nomb - 2007-06-23

    Hello everyone,
    I can't seem to get my snoopy to pull images.  How is this accomplished?
    You can view my page at

    Thank you in advance, I'm seriously pulling my
    hair out...

    Thank You,

    • McDope

      McDope - 2007-06-24

      Can you please show your code?
      Maybe I can help, but without seeing your code this is really hard ;-)

      BTW: Your spider.php is down (404)

    • nomb

      nomb - 2007-06-25

      First off,
      Thank you for helping me. 

      Secondly, I had gotten frustrated and took Snoopy down, but I have now put it back up in hopes you can help me.  The new link is:

      What I am trying to do is just have it act as a proxy.  But I can't get the images, and when you use anything on the page I can't get it to stay routed through mine (like logging into MySpace or doing a Google search).

      Here is the code; don't laugh, I found it on the net:

      /* You need the snoopy.class.php from */
      include "snoopy.class.php";

      $snoopy = new Snoopy;

      $url = $_POST['url'];

      // need a proxy?:
      //$snoopy->proxy_host = "";
      //$snoopy->proxy_port = "8080";

      // set browser and referer:
      $snoopy->agent = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)";
      $snoopy->referer = "";

      // set some cookies:
      $snoopy->cookies["SessionID"] = '238472834723489';
      $snoopy->cookies["favoriteColor"] = "blue";

      // set a raw header:
      $snoopy->rawheaders["Pragma"] = "no-cache";

      // set some internal variables:
      $snoopy->maxredirs = 2;
      $snoopy->offsiteok = false;
      $snoopy->expandlinks = false;

      // set username and password (optional)
      //$snoopy->user = "joe";
      //$snoopy->pass = "bloe";

      // fetch the website
      // other methods: fetchtext, fetchlinks, fetchform, submittext and submitlinks
      if ($snoopy->fetch($url)) {
         // response code:
         //print "response code: ".$snoopy->response_code."<br/>\n";

         // print the headers:
         //print "<b>Headers:</b><br/>";
         //while (list($key, $val) = each($snoopy->headers)) {
         //   print $key.": ".$val."<br/>\n";
         //}
         //print "<br/>\n";

         // print the text of the website:
         //print "<pre>".htmlspecialchars($snoopy->results)."</pre>\n";
         echo $snoopy->results;
         echo "<br><br>";
         echo "<a href=''>Back to - spiders</a>";
      } else {
         print "Snoopy: error while fetching document: ".$snoopy->error."\n";
      }

      Again thank you for helping me I'm looking forward to hearing from you.


      • McDope

        McDope - 2007-06-25

        If you want to use Snoopy as a proxy, you need to do a bit more.

        method a) You just fetch the page and echo the results 1:1. This sends the normal page source to be interpreted by your browser, and you will see the website almost as normal, BUT the images are loaded from the user's own IP, since his browser will request them directly, so you will have no anonymous proxy.

        method b) You fetch the page, get each image URI from the source, save those images to your server, and echo a modified version of the result in which you have replaced all image URIs with the equivalents on your server.
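        A rough sketch of what method b's rewriting step could look like. This is not part of Snoopy; the caching script name image.php is an assumption, standing in for a script you would write yourself to fetch and serve each cached image:

```php
<?php
// Rewrite every <img src="..."> in the fetched HTML so the browser requests
// the image from a (hypothetical) caching script, image.php, instead of the
// original server. Relative URIs are resolved against the page's base URL.
function rewrite_images($html, $baseUrl, $cacheScript = "image.php")
{
    return preg_replace_callback(
        '/(<img\b[^>]*\bsrc=)(["\'])(.*?)\2/i',
        function ($m) use ($baseUrl, $cacheScript) {
            $src = $m[3];
            // resolve relative URIs against the fetched page's base URL
            if (!preg_match('#^https?://#i', $src)) {
                $src = rtrim($baseUrl, '/') . '/' . ltrim($src, '/');
            }
            // point the src at the caching script, real URI as parameter
            return $m[1] . $m[2] . $cacheScript . '?src=' . urlencode($src) . $m[2];
        },
        $html
    );
}
```

        You would then echo rewrite_images($snoopy->results, $url) instead of the raw $snoopy->results.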

        Both methods will not give you a real proxy, but I don't know what your intention is, so... ;-)

        Of course you will also need to handle some cookies if you want to stay logged in to MySpace or other sites with a login. If you want, I can give you an example script that I built for another guy; it demonstrates the workflow of the login process.
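        A minimal sketch of that cookie step, assuming Snoopy's documented behavior of exposing raw response headers as an array of "Name: value" strings in $snoopy->headers. The helper name parse_set_cookies is made up for this example:

```php
<?php
// Pull name=value pairs out of Set-Cookie response headers so they can be
// fed back into $snoopy->cookies before the next fetch, keeping the session
// alive across requests. Attributes like path and expires are dropped.
function parse_set_cookies($headers)
{
    $cookies = array();
    foreach ($headers as $header) {
        if (stripos($header, 'Set-Cookie:') === 0) {
            // keep only the leading name=value pair
            $parts = explode(';', substr($header, strlen('Set-Cookie:')));
            $pair = trim($parts[0]);
            list($name, $value) = explode('=', $pair, 2);
            $cookies[$name] = $value;
        }
    }
    return $cookies;
}
```

        After a fetch you would do something like $snoopy->cookies = array_merge($snoopy->cookies, parse_set_cookies($snoopy->headers)); before requesting the next page.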

        Sorry for my bad English, it's just my 2nd language ;-)


    • McDope

      McDope - 2007-06-25

      I forgot something...

      Also, you will need to replace each internal link URI in the fetched source with a link to your script, with the target URI as a parameter; otherwise, if you click a link you will end up on the real site, not the "proxied" site.
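      A minimal sketch of that link-rewriting step. The script name proxy.php is an assumption (whatever script runs the Snoopy fetch above); here the target is passed as a GET parameter for simplicity, while the code earlier in the thread reads $_POST['url']:

```php
<?php
// Rewrite every <a href="..."> so clicking a link goes back through the
// (hypothetical) proxy script with the real target URI as a parameter,
// instead of leading to the real site.
function rewrite_links($html, $proxyScript = "proxy.php")
{
    return preg_replace_callback(
        '/(<a\b[^>]*\bhref=)(["\'])(.*?)\2/i',
        function ($m) use ($proxyScript) {
            return $m[1] . $m[2] . $proxyScript . '?url=' . urlencode($m[3]) . $m[2];
        },
        $html
    );
}
```

      Echoing rewrite_links($snoopy->results) then keeps every click routed through the script.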

    • nomb

      nomb - 2007-06-26

      The kind I need is with the full page, images and everything cached on my server.  I would be very interested to see the example you made.  And also thank you for your help.


      • McDope

        McDope - 2007-06-26

        I put up the login example at my private space ->

        It logs in to ogame, a browser game, and gets & echoes one page after login.

    • nomb

      nomb - 2007-06-26

      I am a little confused as to how I can use this to make what I need.  Also, when the page loads and you click on a link, you get an error saying the page isn't there.  What I need is for Snoopy to keep working when you load the page through it and then click on any links.

      Thanks again,

      P.S.  What is that page btw?

      • McDope

        McDope - 2007-06-26

        > "I am a little confused as how I can use this to make what I am needing. "
        You can't use that to make what you want. As I said: it's just an example of how to handle cookies and session IDs for logging in somewhere, which you will need if you want to support sites with a login in your "proxy".

        > "What I need is when you load the page through snoopy, you can click on any links and everything and snoopy will keep working. "

        Then do it. As I said: you will just need to fetch the page and replace each link URI with a URI to your "proxy" script, with the real link as a parameter.

        > "P.S. What is that page btw?"
        It's a game that is played over a web browser, nothing special or good. Just an example that I did for another guy.

  • Anonymous

    Anonymous - 2011-05-30

    I'd appreciate it if you could send me that example to cuentanumerouno at hotmail dot com. English is my second language too. Yours is fine, though.

    I made the script, and now I have to log in first to that site, and I don't know how to get/pass/create/copy the cookies for that website to serve the requested page (and see me logged in).
