Hi,
I'm looking to embed Babeldoc into a JSP/J2EE container, to use it in a web application.
I'd like to know whether any transaction management is implemented for the pipeline process.
Is there transaction management in Babeldoc? How are the transactions of all PipelineStages managed during execution of a particular pipeline?
Is it possible to use an external JTA-enabled transaction manager (for example the JBoss transaction manager)?
cheers,
denis
Yes, you can use an external TM, including the 2PC (two-phase commit) protocol. We have done this for BEA WLS.
Here is the basic concept:
- We start the Babeldoc pipeline in a Message-Driven Bean, which starts the transaction.
- All DB resources used inside the pipeline are configured as transactional DataSources.
- All log operations are implemented in another EJB that runs in its own transaction. Otherwise, in case of an exception, the logs would be rolled back, too.
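In EJB terms, the logging bean would carry a transaction attribute like RequiresNew, so its writes commit even when the surrounding pipeline transaction rolls back. A container-free sketch of that semantics (class and method names are made up for illustration; a real deployment relies on the container's JTA transaction manager, not in-memory lists):

```java
import java.util.ArrayList;
import java.util.List;

public class TxDemo {
    // Stand-in for a table written by the pipeline's own transaction.
    public static List<String> pipelineTable = new ArrayList<>();
    // Stand-in for the log table, written via an independent transaction.
    public static List<String> logTable = new ArrayList<>();

    // Simulates the logging EJB (REQUIRES_NEW): commits immediately,
    // regardless of what happens to the caller's transaction.
    public static void logInNewTransaction(String msg) {
        logTable.add(msg);
    }

    // Simulates the MDB's container-managed transaction around the pipeline:
    // writes are buffered and only "committed" if every stage succeeds.
    public static boolean runPipeline(boolean failAtStage2) {
        List<String> txBuffer = new ArrayList<>();
        try {
            logInNewTransaction("stage 1 started");
            txBuffer.add("stage 1 result");
            logInNewTransaction("stage 2 started");
            if (failAtStage2) throw new RuntimeException("stage 2 failed");
            txBuffer.add("stage 2 result");
            pipelineTable.addAll(txBuffer);  // commit
            return true;
        } catch (RuntimeException e) {
            logInNewTransaction("rolled back: " + e.getMessage());
            return false;  // txBuffer discarded = rollback
        }
    }

    public static void main(String[] args) {
        runPipeline(true);
        System.out.println("pipeline rows: " + pipelineTable.size());
        System.out.println("log rows: " + logTable.size());
    }
}
```

After a failed run, the pipeline writes are gone but every log entry survives, which is exactly why the logger must not share the pipeline's transaction.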
Hi Stefan,
thank you so much for your advice. I understand your solution and it seems interesting. What I still wonder is:
why did you use a Message-Driven Bean to start the pipeline and not a Stateless Session Bean? Was it only a requirement of your architecture?
Another doubt is whether transaction management can be propagated even to a pipeline stage that doesn't involve a database, for example the FTP stage component or a filesystem component. Any experience with that?
Thanks again in advance,
Denis
we used MDBs because we loosely coupled multiple pipelines into a larger process. Any client can start such a process by sending a JMS message.
We support multiple messaging providers, e.g. MQSeries, to connect to all kinds of operating systems.
In case of an error during pipeline processing, you can simply restart the process by "resending" the message.
Some call such an architecture an "Enterprise Service Bus". We have also incorporated the Quartz scheduler for batch processing.
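With a transacted JMS session, a real provider redelivers the message automatically after a rollback; the "resending" idea can be simulated in plain Java with an in-memory queue (all names here are illustrative, not Babeldoc APIs):

```java
import java.util.ArrayDeque;
import java.util.Queue;

public class RedeliveryDemo {
    // Stand-in for a JMS queue; a real provider persists these messages.
    public static Queue<String> queue = new ArrayDeque<>();
    public static int attempts;

    // Simulated pipeline run: fails on the first attempt, succeeds on retry.
    public static void process(String msg) {
        attempts++;
        if (attempts == 1) throw new RuntimeException("transient failure");
    }

    // Drain the queue; on failure, put the message back ("resend") and retry.
    public static int runUntilDone(String msg) {
        attempts = 0;
        queue.add(msg);
        while (!queue.isEmpty()) {
            String m = queue.poll();
            try {
                process(m);
            } catch (RuntimeException e) {
                queue.add(m);  // message goes back on the queue
            }
        }
        return attempts;
    }

    public static void main(String[] args) {
        System.out.println("attempts needed: " + runUntilDone("start-process"));
    }
}
```

The point is that the trigger message, not any in-memory state, is the unit of recovery: as long as it sits on a durable queue, a crashed or failed run costs nothing but a retry.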
We haven't used the FTP stage, but we use J2CA adapters to connect to backend systems.
We read files through our own FileReader and remove the file after the first successful pipeline step. At that point the whole payload is stored in a queue and can't get lost anymore.
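That ingest pattern, deleting the source file only after its content is safely enqueued, can be sketched in plain Java (an in-memory queue stands in for the persistent JMS queue a real deployment would use; the class name is made up):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayDeque;
import java.util.Queue;

public class FileIngestDemo {
    // Stand-in for a durable queue; real deployments need persistent messaging.
    public static Queue<String> durableQueue = new ArrayDeque<>();

    // Read the whole file, enqueue its content, and only then delete the source.
    // If enqueueing throws, the file stays on disk and can be picked up again,
    // so the data is never lost between file system and queue.
    public static void ingest(Path file) throws IOException {
        String data = new String(Files.readAllBytes(file));
        durableQueue.add(data);   // the "first successful pipeline step"
        Files.delete(file);       // safe now: the payload lives on the queue
    }

    public static void main(String[] args) throws IOException {
        Path f = Files.createTempFile("babeldoc-", ".dat");
        Files.write(f, "payload".getBytes());
        ingest(f);
        System.out.println("queued: " + durableQueue.peek());
        System.out.println("file still exists: " + Files.exists(f));
    }
}
```

Note the ordering: enqueue first, delete second. Reversing the two steps would open a window where a crash loses the file contents entirely.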
If you find this solution still interesting, we should talk about working together. You can reach me at stefan.krieger@comporsys.de
Regards,
Stefan