From: Guy K H. <gu...@fo...> - 2003-09-30 17:05:22
Thanks Greg, that is good information. Before hearing back from you, I had solved the "bleeding through" of the default context's content by making both sites virtual.

I had thought it would be a disadvantage to have no default context, until I looked at the request logs and noticed tons of 404s for "GET /". The requests are apparently directed at my numerical IP address, and they come in bursts of several per minute from seemingly random IPs, never twice from the same IP. Older logs reveal that this started happening in mid-August. I can only imagine that some worm is out there looking for default IIS index pages? They probably look innocuous enough for a site that gets lots of traffic, but I don't get that much traffic, so these stand out like a sore thumb. They always identify the client as MSIE 5.5 on Windows 98. I mention all this in hopes that someone on the list might enlighten me about what might be generating these requests.

But to my real question: I upgraded to Jetty 4.2.12, and now the symbolic links in my docroot are no longer being served. Is that by design, and is there a way to get the new Jetty to follow symbolic links?

Thanks -- Guy

Greg Wilkins writes:
>
> Three virtual host answers... inline....
>
> Guy K Hillyer wrote:
> > Hi folks
> >
> > I have been running Jetty for several years now, and I'm just back on
> > the support list after the recent spam unpleasantness. Recently, for
> > the first time, I added a virtual host that serves static content.
> >
> > I submitted the virtual host's URL to teoma/askjeeves. Their robot
> > sent me a notice that my submission had been rejected because of the
> > presence of a robots.txt resource.
> >
> > There is a robots.txt in my regular host's document root, but not in
> > the root directory assigned to the virtual host. But I see that when
> > I give the URL for <virtual host>/robots.txt, Jetty delivers the
> > robots.txt from my regular root directory.
> >
> > How can I avoid this "bleeding through" of the regular site's content
> > into the virtual site's? Here is the relevant xml
>
> You need to add a NotFoundHandler to the context that you don't want
> to fall through from:
>
> > <Call name="addContext">
> >   <Arg>/</Arg>
> >   <Set name="ResourceBase">/www/thepalacearcade</Set>
> >   <Set name="virtualHosts">
> >     <Array type="java.lang.String">
> >       <Item>thepalacearcade.com</Item>
> >       <Item>www.thepalacearcade.com</Item>
> >     </Array>
> >   </Set>
> >   <Call name="addHandler">
> >     <Arg><New class="org.mortbay.http.handler.ResourceHandler"/></Arg>
> >   </Call>
>     <Call name="addHandler">
>       <Arg><New class="org.mortbay.http.handler.NotFoundHandler"/></Arg>
>     </Call>
> > </Call>
> >
> > <Call name="addContext">
> >   <Arg>/</Arg>
> >   <Set name="ResourceBase">/www/docroot</Set>
> >   <Call name="addHandler">
> >     <Arg><New class="org.mortbay.http.handler.ResourceHandler"/></Arg>
> >   </Call>
> > </Call>
>
> > My second question is, how can I separate request logging for these
> > two web sites?
>
> You need to get a recent copy of Jetty where I allow the request log
> to be added to the context. 4.2.12 will do....
>
> > Sorry, I forgot already what my third question was. I'm sure it was
> > very important.
>
> I think the answer is 42.
>
> cheers
>
> > Best
> >
> > -- Guy Hillyer
> >
> > _______________________________________________
> > Jetty-support mailing list
> > Jet...@li...
> > https://lists.sourceforge.net/lists/listinfo/jetty-support
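
[Editor's note on the symbolic-link question above: in the Jetty 4.2.x line, the resource code began treating symbolic links as "aliases" of the real file and refusing to serve them, as a security precaution. If that is what changed between your old version and 4.2.12, the commonly cited workaround is a system property on the JVM launch line. The property name and the `start.jar` invocation below are stated from memory and should be verified against the 4.2.12 documentation.]

```shell
# Sketch only: assumes Jetty 4.2.x's FileResource alias checking is what
# blocks symlinks, and that this system property disables it.
# Caution: disabling alias checks lets symlinks point outside the docroot,
# so only do this if you trust everything that can create links there.
java -Dorg.mortbay.util.FileResource.checkAliases=false \
     -jar start.jar etc/jetty.xml
```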
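
[Editor's note on per-context request logging: to illustrate Greg's remark that 4.2.12 lets a request log be attached to a context, the fragment below is a sketch. The `RequestLog` property, the `org.mortbay.http.NCSARequestLog` class, and its `retainDays`/`append` setters are assumptions modeled on Jetty 4's server-wide NCSA logging; check the 4.2.12 javadoc before using. The log path is hypothetical.]

```xml
<!-- Sketch, not verified config: one NCSA-format log per virtual-host
     context, so the two sites' requests land in separate files. -->
<Call name="addContext">
  <Arg>/</Arg>
  <Set name="ResourceBase">/www/thepalacearcade</Set>
  <Set name="RequestLog">
    <New class="org.mortbay.http.NCSARequestLog">
      <!-- yyyy_mm_dd in the name is the usual rollover placeholder -->
      <Arg>/var/log/jetty/arcade_yyyy_mm_dd.request.log</Arg>
      <Set name="retainDays">31</Set>
      <Set name="append">true</Set>
    </New>
  </Set>
</Call>
```

The second context would get its own `<Set name="RequestLog">` with a different filename, keeping the two sites' traffic cleanly separated.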