On 3/25/07, Julien Lecomte <julien@...> wrote:
> > This seems to be a better way to get the contents from SF in its
> > current form. My extension is more complex - it is a rip-off of
> > another news aggregator of mine, stripped down to handle only SF
> > feeds. That's why there are the extra database and synchronization layers.
> I've looked at it, and it's overkill. My current plugin is less than 5
> lines of PHP, doesn't require DB changes or a connection, etc...
> You also gave just source code - no example, no sample website to show
> that it works.
I suspect these list archives are exposed to the internet, and I really
do not want any spam bots spamming this test wiki, so here is a
slightly obfuscated link:
http://wiki + rainforce + org/index.php?title=Special:SourceForgeNews
> My current sandbox method will not work as-is on SF because of an SF
> limitation that my current host does not enforce. This limitation will
> also block your script:
> On SF, we'll have to use the current cron/wget method which means that
> the page will only be downloaded every hour (cache problem solved.)
SF doesn't allow outgoing connections, even to its own site, if I am
not mistaken. It is easy to check, though. Cron jobs were also turned
off last time I checked, some months ago. So I've set up an external
aggregator, which is triggered by users via an image linked to an
external site. News items are then fed in through an XML-RPC
interface. This technique is used in the farplugins project here on SF.
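For illustration, here is a minimal sketch of that push step in Python.
The method name (news.push), the item fields, and the in-process test
server are all hypothetical - the actual farplugins XML-RPC interface
may look quite different:

```python
# Sketch of feeding news items through an XML-RPC interface.
# Endpoint and method names are invented for this example.
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

# Stand-in for the wiki side: stores pushed news items.
received = []

def push_news(title, link):
    received.append((title, link))
    return len(received)  # total items stored so far

# Bind to an ephemeral local port and serve in a background thread.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(push_news, "news.push")
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# The external aggregator side: once triggered (e.g. by the image
# request), it feeds the collected items to the wiki over XML-RPC.
proxy = ServerProxy(f"http://127.0.0.1:{port}/")
count = proxy.news.push("Release 1.2 announced", "http://example.org/item/1")
print(count)  # prints 1

server.shutdown()
```

The point of the indirection is that the SF-hosted wiki never has to
open an outgoing connection itself; the external host does the
fetching and pushes the results in.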