[asio-users] Server/Client-Server model
From: Gonzalo G. <gga...@gm...> - 2012-11-27 17:27:56
I have a program that is an image viewer. This program needs to sync with several instances of the viewer in different locations (for testing, let's say they are all on localhost). There should be one server and several clients, which could act as servers too. Take the timeline as an example: if any server or client moves the timeline, the change should be propagated to all clients and the server. I have devised a very simple language, one line at a time, for this sync.

I have studied the code of timeout/, echo-server-client/ and others, and tried modifying them for my needs, with some problems: I sometimes see the socket closed, and the timers do not wake up all the time. Attached is the code for the server and clients, but without the full viewer.

To compile:

  $ g++ mrvServer.cpp mrvClient.cpp -o server -lboost_system -lboost_thread -lpthread

To run:

  $ server            (for server mode)
  $ server localhost  (for client mode)

Both the client and the server sessions are derived from a ParseCommand class, which is used to parse the language and return a boolean indicating whether the line was understood. The server/client then sends an "OK" or "Not OK" message.

With the attached files I cannot reproduce my full problem. The only problem I do see is that after 30 seconds or so, the connection gets closed. With the real application, the problem is that the server does not modify my timeline: the output queue gets filled, but the send function never sends the message, because the socket incorrectly reports that it is closed. I would appreciate any help with making this work from the command line.