"it would be even possible to code an Akregator web frontend to access your feeds from any machine (volunteers are welcome ;-) )." (Akregator blog)
I really liked the idea of a web frontend, so I decided to think about it (and asked Frank to help me ;-) ).
Now there are several problems that came to our minds:
- What should be stored on client-side, what on server-side?
- When to synchronize?
- Who fetches feeds, server or clients?
- Which technology should be used? (PHP5, Perl, ...)
- Which protocol? (XML-RPC, ...)
- How to minimize traffic?
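To make the protocol question more concrete: XML-RPC, one of the options listed, is easy to sketch. Below is a minimal Python sketch of what a feed-archive service could look like; the service class, method names, and return values are all made up for illustration and are not any actual Akregator interface.

```python
# Minimal XML-RPC sketch of a feed-archive service (hypothetical API).
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

class ArchiveService:
    """Hypothetical server-side archive exposed to web-frontend clients."""

    def list_feeds(self):
        # Return the feed URLs the server archive knows about.
        return ["http://akregator.sf.net/blog/rss"]

    def articles_since(self, feed_url, timestamp):
        # Return articles fetched after the given timestamp (dummy data here).
        return [{"title": "New archive backend", "fetched": timestamp + 1}]

# Server side: bind to a free port and serve in the background.
server = SimpleXMLRPCServer(("localhost", 0), logRequests=False)
server.register_instance(ArchiveService())
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: any machine can talk to the archive in a few lines.
proxy = ServerProxy(f"http://localhost:{port}")
print(proxy.list_feeds())
```

The appeal of XML-RPC here is that both ends fit in a handful of lines and bindings exist for PHP and Perl as well, so the choice of server technology stays open.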
Anyone else interested in a web frontend?
BTW: There is a wiki page where some ideas are collected:
As the new archive backend gets in shape (http://akregator.sf.net/blog), we can start thinking about the details of WebAkregator. Depending on the actual implementation, our archive interface may need major changes, but it should [...]
As Sébastien is interested in the project, and Eckhart is interested in coding the web frontend, I hope we get some useful discussion here. As I have no experience in developing web applications, I leave the first comments to others.
For now I just dump the issues which came to my mind:
* What should be stored on client-side, what on server-side?
* When to synchronize?
* Who should fetch feeds, server or clients?
* Which technology should be used?
* How should the protocol look?
* How to minimize traffic?
Well, Frank and I posted at nearly the same time, so forget my post and continue here, please. ;-)
On Monday, 11 April 2005 at 18:03, Frank Osterfeld wrote:
> For now I just dump the issues which came to my mind:
> * Who should fetch feeds, server or clients?
My opinion is that the server should fetch the feeds. This makes it very unlikely that you miss anything (I personally know a feed where staying away from the computer for one day means you miss something).
The next point is that the feed archive should be used by multiple clients. If the clients fetch the articles, they may fetch them simultaneously if they are running Akregator at the same time. This would make feed providers unhappy.
Thirdly, there is a problem with archive distribution. If the feeds are fetched by one client, they have to be uploaded to the server anyway to make sure they are available there for the other clients. P2P sharing between two clients seems impractical.
> * How to minimize traffic?
The client always caches the articles. It asks the server whether new articles are available, and the server responds with what has changed since the last call. The client then asks the server to send it the missing articles, and deletes the expired articles from its cache.
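The exchange described above can be sketched in a few lines of Python. This is only an illustration of the idea, not a proposed interface; the function, the dictionary layout, and the `fetched` timestamp field are all made-up names.

```python
# Sketch of the cache synchronization: the client keeps a local cache,
# the server reports what changed since the client's last call, and the
# client fetches only the missing articles and drops expired ones.
# All names (sync, "fetched", article ids) are hypothetical.

def sync(client_cache, server_archive, last_sync):
    # Server side: compute the delta since the client's last call.
    new_ids = [aid for aid, art in server_archive.items()
               if art["fetched"] > last_sync]
    expired_ids = [aid for aid in client_cache
                   if aid not in server_archive]

    # Client side: fetch only the missing articles ...
    for aid in new_ids:
        client_cache[aid] = server_archive[aid]
    # ... and delete the articles the server has expired.
    for aid in expired_ids:
        del client_cache[aid]
    return client_cache

# The server archive still holds article 1 and has a new article 3;
# the client still caches article 2, which the server expired.
archive = {
    1: {"title": "old article", "fetched": 10},
    3: {"title": "new article", "fetched": 42},
}
cache = {
    1: {"title": "old article", "fetched": 10},
    2: {"title": "expired article", "fetched": 5},
}
print(sync(cache, archive, last_sync=10))
```

Only the new articles and the list of expired ids cross the wire, which is exactly the traffic-minimization goal from the issue list.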
Additionally: Security – yes, I take this seriously ;-)
Client-side: The client must not react badly to whatever the server sends to it.
Server-side: The server must not react badly to what a client sends to it, nor to what the feeds it fetches contain. Be aware of CSS inside the web content.
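One way to defuse hostile feed content before it reaches a browser is to strip active elements and escape the rest. The sketch below, using only Python's standard library, is meant to illustrate the idea; a real frontend would use a proper, well-audited HTML sanitizer rather than this minimal stripper.

```python
# Sketch: strip <script>/<style> elements from feed HTML and escape the
# surviving text so it renders as plain text. Illustration only -- a real
# deployment should rely on a dedicated sanitizing library.
import html
from html.parser import HTMLParser

class TagStripper(HTMLParser):
    def __init__(self):
        super().__init__()
        self.parts = []
        self.skip_depth = 0  # > 0 while inside <script> or <style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        # Keep only text that is not inside a script/style element.
        if not self.skip_depth:
            self.parts.append(data)

def sanitize(feed_html):
    stripper = TagStripper()
    stripper.feed(feed_html)
    # Escape the surviving text so no markup survives at all.
    return html.escape("".join(stripper.parts))

print(sanitize('Hi <script>alert("xss")</script><b>there</b>'))
```

This covers the server-side worry above: whatever a malicious feed embeds, only inert, escaped text is stored and handed to clients.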