From: Mitch S. <mit...@be...> - 2009-08-14 22:43:10
Hi, I'm sorry that I missed this message when it first came across the list. I'm our lab's resident R advocate, so I'm definitely on board with your use case.

I thought about it for a bit; I think you're right that trusting only the local host would address the security concerns. But it only applies to the (relatively small) group of users who will run some kind of web server on the same machine as their web browser, right? So I'm not sure how far it advances the ball. And it's sort of a complicated story to tell users--"you can enter a URL, but it has to be for your machine."

I still think the right place to aggregate data is on the server, not the client. The user could enter an arbitrary URL, and then the server would fetch the data from there, process it, and send it to the user. That should address your use case, right? It would also work for a large swath of other users, address the security issues, and not require users to have extra software running on their machines.

I looked at the xmapbridge documentation, and it says:

=====
Importantly, all graph rendering is performed on the local machine so none of the underlying graph data is uploaded to the X:Map server to generate your graphs.
=====

I've been assuming that people would be willing to upload data to the server in order to visualize it in context with other genomic data. So I'm interested in the reasons why the people who wrote xmapbridge felt it was important not to upload data to the server. Is it a confidentiality thing? A data volume thing? Or something else entirely? As far as I've seen, the set of people who will upload data to the server (or point the server toward a flat file or DAS source somewhere else on the web) is larger than the set of people who want to keep the data on their machine and can run all the necessary software locally.
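[Editor's sketch: the server-aggregation idea above--fetch from a user-supplied URL, process, forward to the browser--reduces to a small transform on the server. This is a minimal illustration only; the tab-separated field layout and the `aggregate_features` name are hypothetical, not JBrowse's actual format or API.]

```python
import json

def aggregate_features(raw_text):
    """Parse a flat tab-separated feature file (chrom, start, end, score),
    as the server might fetch it from a user-entered URL, and build the
    JSON payload that would be sent on to the browser client.
    The column layout here is a hypothetical example."""
    features = []
    for line in raw_text.strip().splitlines():
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        chrom, start, end, score = line.split("\t")
        features.append({
            "chrom": chrom,
            "start": int(start),
            "end": int(end),
            "score": float(score),
        })
    return json.dumps({"features": features})
```

Because the fetch happens server-side, the browser only ever talks to the JBrowse host, which sidesteps the cross-origin problem and keeps the client software-free.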
Part of the point of JBrowse (as opposed to, say, IGB) is the assumption that people find it easier if they don't have to install software locally. Perhaps you're happy running xmapbridge on your machine, but I imagine you need to share your visualizations with the people you work with--are they also R-savvy?

So I plan to tackle the server-aggregation approach first. We might work on this down the road, though. And it's open source, so you can do whatever you like, of course.

Regards,
Mitch

On 07/23/2009 11:54 AM, Michael Lawrence wrote:
> Hi,
>
> Just heard about JBrowse. Love the interactivity.
>
> I would like to serve data from R, running on the local machine, to
> JBrowse. X:Map can already do this using the xmapbridge package.
> Basically, they use JSONP to communicate with a Java web server that
> interacts via the file system with R. I'd like to do something similar
> with JBrowse. I understand the security concerns, but perhaps the
> local host can be trusted? The user could enter URLs into JBrowse, and
> the data would be retrieved from a web server embedded in R.
>
> It would be great if JBrowse could support this.
>
> Thanks,
> Michael
>
> _______________________________________________
> Gmod-ajax mailing list
> Gmo...@li...
> https://lists.sourceforge.net/lists/listinfo/gmod-ajax
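[Editor's sketch: the JSONP bridge Michael describes works by having the local server wrap its JSON payload in a callback name supplied by the page; the browser loads the response as a script, so the same-origin restriction on plain XMLHttpRequest does not apply. The helper below is an illustrative fragment of such a bridge server, shown in Python rather than Java; the function name and parameter are hypothetical.]

```python
import json

def jsonp_response(callback, payload):
    """Wrap a JSON payload in the caller-supplied callback name, as a
    local bridge server (like xmapbridge's Java server) would do for a
    JSONP request. The browser executes the result as JavaScript,
    invoking callback(payload)."""
    return "%s(%s);" % (callback, json.dumps(payload))
```

For example, a request like GET /features?callback=handleData would get back a body of the form handleData({...}); which the page's handleData function then receives.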