From: Rob V. <rv...@do...> - 2012-09-14 21:18:04

I never heard back from anyone, so I wrote up mini-bios for everyone, which can be found at http://www.dotnetrdf.org/content.asp?pageID=Developers

Any objections or changes, please let me know.

Rob

From: Rob Vesse <rv...@vd...>
Reply-To: dotNetRDF Developer Discussion and Feature Request <dot...@li...>
Date: Monday, August 20, 2012 2:06 PM
To: dotNetRDF Developer Discussion and Feature Request <dot...@li...>
Subject: SPAM-LOW: [dotNetRDF-Develop] Developer Bios on Project Website?

> Hi All
>
> Currently we have brief bios of myself and Ron on the Project Homepage. What I
> would like to do is change those to simple one-line statements of the form "X
> works as Job Title at Company" and then link to a separate developer bios page
> where we could put more information about each developer and their
> contributions to the project and other interests, e.g. side projects,
> commercial work with dotNetRDF or otherwise, etc.
>
> If you'd like a more interesting bio than the one I will write up for you
> later this week (which will likely just summarize who you are, the company you
> work for and your involvement in the project) then please send me a quick bio
> in reply to this email.
>
> Rob
> ------------------------------------------------------------------------------
> Live Security Virtual Conference
> Exclusive live event will cover all the ways today's security and threat
> landscape has changed and how IT managers can respond. Discussions will
> include endpoint security, mobile security and the latest in malware threats.
> http://www.accelacomm.com/jaw/sfrnl04242012/114/50122263/
> _______________________________________________
> dotNetRDF-develop mailing list
> dot...@li...
> https://lists.sourceforge.net/lists/listinfo/dotnetrdf-develop
From: antonio n. <nun...@gm...> - 2012-09-13 21:40:06

Thank you Rob. You're right, I was looking for a solution that avoids the inferencing mechanism in OWLIM-Lite after a delete. For the moment I have sent a request to try OWLIM-SE, which could offer better performance for insert and delete operations.

Regards,
-Antonio

2012/9/13 Rob Vesse <rv...@do...>:

> Hi Antonio
>
> What exactly do you mean by "Could I avoid to trigger the reconstruction
> of the graph with dotNetRDF?"
>
> I'm not sure if this is currently possible with dotNetRDF.
>
> As I understand the problem from your questions on the OWLIM mailing list,
> the issue is down to the inferencing strategy used by OWLIM, so I'm not sure
> if changing anything in dotNetRDF would resolve your problem, but you
> haven't explained it enough for me to understand what, if anything,
> dotNetRDF would need to do.
>
> Rob

--
Antonio Nunziante
From: Rob V. <rv...@do...> - 2012-09-13 19:22:16

Hi Antonio

What exactly do you mean by "Could I avoid to trigger the reconstruction of the graph with dotNetRDF?" I'm not sure if this is currently possible with dotNetRDF.

As I understand the problem from your questions on the OWLIM mailing list, the issue is down to the inferencing strategy used by OWLIM, so I'm not sure if changing anything in dotNetRDF would resolve your problem; but you haven't explained it enough for me to understand what, if anything, dotNetRDF would need to do.

Rob

From: antonio nunziante <nun...@gm...>
Reply-To: dotNetRDF Developer Discussion and Feature Request <dot...@li...>
Date: Thursday, September 13, 2012 1:31 AM
To: <dot...@li...>
Subject: [dotNetRDF-Develop] improve performance with OWLIM-Lite
From: Rob V. <rv...@do...> - 2012-09-13 16:07:56

Yes, but I didn't get a chance to look in detail yet to see whether that would solve the problem.

Rob

On 9/12/12 9:56 PM, "Graham Moore" <gr...@ne...> wrote:

> On 12 September 2012 23:12, Rob Vesse <rv...@do...> wrote:
>> I didn't think Slice.cs was really that scary, but I will add some extra
>> comments to that code anyway
>
> Sorry, it was more that it went down into LazyBgp than Slice itself.
>
>> The problem, as your later email suggests, is almost certainly in the lazy
>> evaluation of such queries, not in the general case evaluation
>
> Did you see the email with the suggested fix?
>
>> Rob
>
> --
> Graham Moore, Director, Networked Planet Limited
> Editor XTM 1.0, ISO13250 (TopicMaps) -2, -3, TMCL
> e: gra...@ne...
> w: www.networkedplanet.com
> t: +44 1865 811131
> m: +47 90056479 (Norway)
>
> Networked Planet Limited is registered in England and Wales, no. 5273377
From: antonio n. <nun...@gm...> - 2012-09-13 08:31:15

Hi all,

I have built a web application that communicates with an OWLIM-Lite repository. My problem is that with this kind of repository, updating the graph (for example with a delete or insert operation) takes too much time.

Could I avoid triggering the reconstruction of the graph with dotNetRDF?

Moreover, how could I turn off inference when I query the store?

Thanks all

--
Antonio Nunziante
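Antonio's delete-performance problem is characteristic of stores that eagerly materialize inferred triples: retracting an explicit triple forces the store to work out which inferences are no longer supported, typically by recomputing the closure. A toy sketch (plain Python, not OWLIM or dotNetRDF code; the single subclass-transitivity rule and the data are illustrative assumptions) of why deletes are expensive under forward-chaining materialization:

```python
# Toy forward-chaining store (illustrative only, not OWLIM internals).
# Single assumed rule: transitivity of "subclass" edges.

def closure(explicit):
    """Compute the transitive closure of a set of (sub, super) edges."""
    inferred = set(explicit)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(inferred):
            for (c, d) in list(inferred):
                if b == c and (a, d) not in inferred:
                    inferred.add((a, d))  # materialize the inference
                    changed = True
    return inferred

explicit = {("Cat", "Mammal"), ("Mammal", "Animal")}
store = closure(explicit)
assert ("Cat", "Animal") in store  # materialized at load time

# Deleting one explicit edge may invalidate previously materialized
# inferences, so the whole closure must be recomputed after the delete;
# this recomputation is the expensive step Antonio is hitting.
explicit.discard(("Mammal", "Animal"))
store = closure(explicit)
assert ("Cat", "Animal") not in store
```

Inserts are cheap by comparison, since new triples can be chained forward incrementally; deletes require truth maintenance over everything already derived.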
From: Rob V. <rv...@do...> - 2012-09-12 21:13:13

I didn't think Slice.cs was really that scary, but I will add some extra comments to that code anyway.

The problem, as your later email suggests, is almost certainly in the lazy evaluation of such queries, not in the general case evaluation.

Rob

On 9/11/12 6:49 AM, "Graham Moore" <gr...@ne...> wrote:

> Hi Rob,
>
> Just hit a possibly nasty issue with the 0.7.0 release (it may be a
> slightly later version).
>
> If I have a limit query such as:
>
> PREFIX tf: <http://theforce.net/schema/>
> SELECT ?entry ?planet WHERE {
>   ?entry tf:type <http://theforce.net/data/category> .
>   ?entry tf:planet ?planet .
> } LIMIT 20
>
> and there are fewer than 20 results, then the result set contains NO
> variable bindings for ?planet. If the limit is less than or equal to the
> number of results it works as expected.
>
> I just started to look into Slice.cs (it was a bit scary) so I thought
> I'd ask you if you were aware of this or had any quick pointers as to
> how I should go about fixing it, or even better if you know it's been
> fixed.
>
> Thanks,
>
> Graham
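Rob's diagnosis points at the lazy-evaluation path rather than the slice operator itself. As a language-neutral sketch (plain Python, not dotNetRDF's actual Slice.cs or LazyBgp code; the function names and data are invented for illustration), correct LIMIT semantics take up to N *complete* solutions, with every variable still bound even when fewer than N solutions exist:

```python
# Minimal sketch of LIMIT/slice semantics over a two-pattern join.
# Not dotNetRDF code; names and data are illustrative assumptions.
from itertools import islice

def join(entries, planets):
    """Join '?entry tf:type ...' with '?entry tf:planet ?planet'."""
    for entry in entries:
        for e, planet in planets:
            if e == entry:
                # A complete solution binds BOTH variables.
                yield {"entry": entry, "planet": planet}

def evaluate_with_limit(entries, planets, limit):
    # Correct behaviour: take up to `limit` complete solutions; if fewer
    # exist, return them all with every binding intact. The reported bug
    # was that the lazy path dropped ?planet bindings in that "fewer than
    # limit" case.
    return list(islice(join(entries, planets), limit))

entries = ["e1", "e2"]
planets = [("e1", "Tatooine"), ("e2", "Hoth")]

results = evaluate_with_limit(entries, planets, 20)  # limit > result count
assert len(results) == 2
assert all("planet" in r for r in results)  # bindings must not be dropped
```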
From: Tomasz P. <tom...@gm...> - 2012-08-29 17:16:47

I have had the same recently. Some kind of a bug with fresh repositories. They do fix it quickly though.

And you're right about the command line. The hginit.com tutorial explains using the command line very nicely with examples. For everyday tasks, however, I would still use a GUI tool more.

Tom

On Wed, Aug 29, 2012 at 6:27 PM, Rob Vesse <rv...@do...> wrote:
From: Rob V. <rv...@do...> - 2012-08-29 16:28:29

One issue I've run into with BitBucket is that some repositories will give the error "remote: ssl required" when trying to push to the repo from your local machine.

I have an open ticket for them to address this, but a workaround is to use SSH push instead of HTTPS; see https://confluence.atlassian.com/display/BITBUCKET/Set+up+SSH+for+Mercurial for documentation.

Btw, I rather like the Mercurial command line; it is especially useful for really understanding how Mercurial works.

Rob

From: Ron Michael Zettlemoyer <ro...@ze...>
Reply-To: dotNetRDF Developer Discussion and Feature Request <dot...@li...>
Date: Wednesday, August 29, 2012 6:14 AM
To: dotNetRDF Developer Discussion and Feature Request <dot...@li...>
Subject: Re: [dotNetRDF-Develop] Mercurial Repositories - Please Review
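The SSH workaround boils down to repointing the clone's default path at Bitbucket's SSH URL instead of HTTPS. A sketch of the relevant `.hg/hgrc` entry (the repository path here is a hypothetical example, not necessarily the project's actual repo):

```ini
; .hg/hgrc in your local clone -- repository path is a hypothetical example
[paths]
; Before (HTTPS push, may fail with "remote: ssl required"):
; default = https://bitbucket.org/dotnetrdf/core
; After (SSH push):
default = ssh://hg@bitbucket.org/dotnetrdf/core
```

With this in place, `hg push` uses the SSH transport and the key you registered with Bitbucket instead of HTTPS authentication.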
From: Ron M. Z. <ro...@ze...> - 2012-08-29 13:15:22

To add to the recommendations: if you want a Mac client, Atlassian's SourceTree is a good one. Perhaps the only one!

http://www.sourcetreeapp.com/

On Wed, Aug 29, 2012 at 7:45 AM, Tomasz Pluskiewicz <tom...@gm...> wrote:
From: Tomasz P. <tom...@gm...> - 2012-08-29 11:46:26

Hi

For anyone interested, I recommend using TortoiseHg (http://tortoisehg.bitbucket.org/) and VisualHg (http://visualhg.codeplex.com/).

The first integrates with Explorer and has the excellent Hg Workbench tool I use every day. The latter adds Mercurial integration to Visual Studio; this is very important for tracking file renames. See the Documentation tab for help setting up. Uber simple.

I have some suggestions for novice Mercurial users that I felt were easiest expressed with video, but the CamStudio recorder crashed twice already after I had been talking for like 20 minutes :(.

Good luck
Tom

On Wed, Aug 29, 2012 at 12:36 AM, Rob Vesse <rv...@do...> wrote:

> Hi All
>
> With Tom's help and guidance I have pushed my initial efforts at converting
> the SVN repository to Mercurial up live on SourceForge.
>
> Please take a look at the repos at
> http://dotnetrdf.hg.sourceforge.net/hgweb/dotnetrdf/ and let me know what
> you think. Unless anyone has any major objections I'd like to switch over
> to using that immediately; I've been playing with Mercurial for a few days
> and it is really easy to get the hang of.
>
> There is an excellent tutorial up at http://hginit.com and there is also a
> detailed user manual at http://hgbook.red-bean.com
>
> Note that the binary repos were just done with a copy, paste and commit
> rather than a full conversion, because I didn't really see much value in
> converting the past commit history of those binaries. That can still be
> obtained from SVN if anyone actually cares.
>
> If this looks good to people we'll get these cloned over to BitBucket and
> start using that as our day-to-day repositories; as Tom suggested previously
> we'll periodically push changes back to the SF repo so we keep a copy of the
> code up to date there as well. One of the benefits of this is that people
> who discover us only through SourceForge will always be able to find the
> code, but we can utilize the user- and collaboration-friendly features over
> at BitBucket for our day-to-day work.
>
> Rob
From: Tomasz P. <tom...@gm...> - 2012-08-29 09:33:50
|
Hi Rob Great we are getting close here :) On Wed, Aug 29, 2012 at 12:29 AM, Rob Vesse <rv...@do...> wrote: > Hey Tom > > Yet more comments... > > On 8/28/12 1:57 AM, "Tomasz Pluskiewicz" <tom...@gm...> > wrote: > >>Hi Rob >> >>Comments inline again >> >>Tom >> >>On Mon, Aug 27, 2012 at 7:11 PM, Rob Vesse <rv...@do...> wrote: >>> Hey Tom >>> >>> I have not had chance to look properly yet but a few comments inline... >>> >>> On 8/25/12 7:47 AM, "Tomasz Pluskiewicz" <tom...@gm...> >>> wrote: >>> >>>>Hi >>>> >>>>After much thought I have created a bitbucket team for dotNetRDF and >>>>not one but multiple repositories. All of them imported as of revision >>>>2374. Further changes will have to be applied manually but that >>>>shouldn't be much problem. >>> >>> Do you have notes on how to perform the SVN to HG conversion? Or a >>> pointer to a good tutorial? >>> >> >>There are two steps there: svnsync to create a local copy of the >>repository and the hg convert extension. I have just uploaded the >>script to a repository: https://bitbucket.org/tpluscode/dnr-migration >> >>This is exactly the way I did before. However it is not complete, >>because I had to manually recreate/sync tags between repositories and >>change the hierarchy so that the lib, core and utilities are >>subrepositories of th build repository. I think this can work in an >>incremental manner. Just rerun the batch file when svn commits appear >>and they will get aplied on top of the hg repositories. There should >>be no problems until one would start to fiddle with the commit history >>in hg after converting > > Thanks, you'll find a pull request from me on Bitbucket with updated > scripts. Btw can you add me (rvesse) to the dotnetrdf team? > Did that and removed the previous repositores and imported the new from SF. I used my old email to create the team. Please add a new email for the project and remove the current. Also I think you might revoke my admin rights and just leave me as developer. 
> > After a bit of fiddling I got it to a state I liked, added a few extra > tags and I pushed the conversion up to > http://dotnetrdf.hg.sourceforge.net/hgweb/dotnetrdf/dotnetrdf > > Let me know what you think. > Looks great. I'm glad we've reached a compromise. Also it probably is for the better we won't be using these subrepositories. > >> >>> >>>>I'm thinking maybe we could meet online >>>>using join.me/VoIP? It would be easiest and quickest way for us to >>>>discuss my proposed version control changes and possibly establish >>>>some common workflow. Who would be interested. Also what are you >>>>timezones? I'm in Poland so it's UTC+1. >>> >>> California, UTC-8 >>> >>>> >>>>In short there are four repositories on bitbucket >>>>(https://bitbucket.org/dotnetrdf): >>>> >>>>- core >>>>- utilities >>>>- documentation >>>>- build >>> >>> So these are the proposed masters right now? >>> >>> I am strongly against hosting the masters off SourceForge, if people >>>want >>> to clone the master repos from SourceForge off to a remote service like >>> BitBucket that is fine by me but I intend to continue using SF as the >>>VCS >>> hosting provider. >>> >> >>Out of curiosity could you elaborate why are you relucatant to leave >>SF as the code host? > > Primarily this is just an issue of continuity, maintaining SF as the > central record for source code > >>At this moment I don't see any advantages of SF. >>Still it has some shortcomings and missing features: >> >>- it seems it isn't possible to host multiple repositories for one >>projects - minor but a limitation nonetheless > > Per your later comment it does allow this > >>- bitbucket integrates nicely with pull requests, meaning that when >>someone outside the team wants to contribute they just need one click >>on bitbucket to notify you >>- on bitbucket it is possible to view difference between forks online > > I have to agree these are nice features and since this in DVCS we can have > a compromise. 
We maintain the SF repo as the "stable" repo which only I > push to and we do everything else through repos hosted on bitbucket for > the user friendliness. Periodically I will push the changes from our > "master" on bitbucket over to SF > > Which is fairly similar to what you suggested in the first place, I guess > you talked me round in the end :-) (A few days of playing with mercurial > helped as well) > In the end I think you convinced me as well that as far as the team is concerned pulling and pushing may not be required. At least not while working on bugfixes and minor stuff. Myself, I would create a clone when I wanted to work on some bigger part or just experiment. > >> >>In any case I'm fine with sticking to SF, but in that case the >>migration script will need rethinking. > > Per my earlier comment, already done, thanks for getting this started > >> >>> >>> Is there a way to modify those repositories so they become clones of the >>> masters at SF as and when we get those up and running? >>> >> >>You see this is the beauty of distributed code. You don't have to do >>anything. If by master you mean the most up-to-date repository, then >>it is only a matter of agreement within a team. You can just clone >>your repository to a new destination and start using it as your new >>master. Because they are related you can always go back to the old one >>and all it takes, in terms of an open source project, is changing the URL >>advertised on the website, so that devs can update it in their VCS >>clients. > > Having played with this I understand it now, as I said above I can clone > the SF repo to bitbucket, then make my own clone on bitbucket, pushing > changes to that and accepting pull requests for the user friendliness and > then push the changes to the bitbucket "master" and the SF repo as > necessary > >> >>> >>>> >>>>Core is most of trunk except build and utilities, which got their own >>>>repositories. 
>>>>Documentation is separate because it was in its own folder under >>>>SVN's root and there was no way I know of to merge it with core while >>>>keeping history. >>>>Build is a bit different. It contains the others as subrepositories >>>>and when pulled it pulls the others along and when pushed it pushes the >>>>others too. As such I think that only Rob should be pushing to the >>>>Build repository. >>> >>> I disagree, any committer should be able to push to any of our >>> repositories. The beauty of version control is that if someone does >>> something we disagree on we can always roll it back. >>> >> >>I agree totally. However giving anyone direct access to all >>repositories could encourage a mess. Again, with version control like >>CVS or SVN that is the only way. With DVCS you can limit write access >>to your masters and still allow people to work on the code and commit >>to their own clones. And when they are ready to contribute they ask you to >>pull their changes. With a project where there is one lead developer >>this may not be so obvious but you surely must have experienced this >>with SVN: someone commits broken code and it is immediately >>distributed among other team members. DVCS gives you the power to >>avoid that. >> >>> >>>>It actually should keep little more than build >>>>scripts and (nightly) builds. It should also be used to sync tags >>>>created in the other repositories. I have done that by linking the >>>>051, 060, 061, 062, 070 and 071 tags. Updating to any one of them in Build >>>>updates the other repositories to the same tag. Any older tags are >>>>lost because before around 2010/08 the repository had a different >>>>layout and the SVN to Hg conversion can't guess that transition. Other >>>>than that I don't think it is vital. The commits are still in SVN for >>>>reference anyway. >>> >>> I have to admit to being slightly confused about why you would bother >>> splitting into separate repositories at all given what you've just >>> described. 
For example the utilities need the core to build so a >>> developer would have to check both out regardless so why not have the >>> "build" repository just be a trunk repository with trunk pretty much as >>>it >>> exists now. >>> >> >>It could be this way. My layout is just a proposal and it may indeed >>be overly complicated for now. >> >>> >>> However splitting out the documentation makes sense, I would also >>>suggest >>> splitting out the checked-in binaries into two separate repositories - >>> binaries-nightly and binaries-stable - >>> http://dotnetrdf.hg.sourceforge.net/hgweb/dotnetrdf/ >>> >> >>Oh I have just clicked the link above. So it is possible to have >>multiple repositories on SF after all. Then some of what I have >>already written is no longer valid :) >> >>Your proposed layout is simpler, and simpler often means better. I will >>update the migration script so that it produces exactly that for >>comparison >> >>> >>>> >>>>What have I done: >>>> >>>>1. Updated paths in projects/files so that all builds fine >>>>2. Added a stripped version of dotNetRDF.sln for people who would >>>>clone just the core project and renamed the other dotNetRDF.All.sln >>> >>> Regardless of the final repo structure having separate solutions for >>> people who want partial builds makes sense to me +1 > > Already added a dotNetRDF.Core.sln to the SF repo > >>> >>>> >>>>Things I didn't finish: >>>> >>>>1. GraphBenchmarker project. The gzipped datasets now cause missing ttl >>>>files for me. >>> >>> I'll fix this in SVN > > > Fixed in SF repo instead > >>> >>>>2. dotNetRDF.snk file is missing >>> >>> Think that got added after the revision you converted > > Was present in the revision I converted so is in the SF repo > >>> >>>>3. The web projects don't load for me at the moment (see question 2. >>>>below) >>>>4. 
I cannot run unit tests (I am currently using VS Express + >>>>Lightswitch and I'm unable to open the unit tests project) >>> >>> Likely a limitation of those versions of VS >>> >> >>Yea. I should buy VS Pro soon. > > I have been lucky enough not to have to do that for some time thanks to > MSDNAA and university site licenses. > I know. Have one still installed on my old laptop, Ultimate version even. My MSDNAA time > >> >>> >>>>5. File bad-example.ttl is missing in rdfWebDeploy project >>> >>> It isn't needed, I'll remove it from the project information. > > Removed from SF repo > >>> >>>>6. I haven't touched the Build folder because I don't want to break >>>>something. I think updating paths should be enough. >>>> >>>>I have some questions/suggestions: >>>> >>>>1. I think that the *.suo and *.user files shouldn't be committed. >>>>Is there a reason why they are? >>> >>> Not really > > Removed from SF repo > >>> >>>>2. For portability I suggest using IIS Express for the WebDemos, BSBM >>>>and sp2b projects. Won't require having IIS installed. >>> >>> So VS Express actually won't load projects that use IIS? Does it give >>>you >>> some sort of error when you try and run the project? >>> >> >>It says "The Web Application Project WebDemos is configured to use >>IIS. The Web server 'http://localhost/demos' could not be found" >> >>> >>> I'd much prefer to stick with IIS, and I assume here by IIS Express you mean >>> the VS Development server? No-one is going to deploy in production with >>> the development server and I would much rather build and test on a real >>> IIS environment. >>> >> >>No I don't mean the VS Development server. I mean IIS Express >>(http://learn.iis.net/page.aspx/868/iis-express-overview/), which is a >>simpler version of IIS 7.5 but fully compatible with it, unlike the >>built-in server. It is installed online or with the Web Platform >>installer. 
It can even be installed directly from VS 2010 (I think SP1 >>is required) and VS Web Developer Express. All you need to do is >>right-click on a web project and choose "Use IIS Express". Should you >>open the solution on a new system, Visual Studio also tells you that IIS >>Express is required and lets you install it. No administrator rights >>required, unlike with full IIS. > > Hmmm, I don't have this option on a web project but from reading around it > sounds like you need IIS Express installed first > > I will experiment and update the projects in the SF repo when I have > resolved this > I think the VS-IIS Express integration requires Visual Studio SP1 to be installed. > > Rob > >> >>> >>> Btw anyone can install IIS Express on most Windows installations by >>>going >>> to Control Panel > Programs > Turn Windows features on or off and then >>> select the IIS components they need from the list (assuming Windows 7). >>> You can do this on Windows XP/Windows Vista as well; the control panel >>> naming is just slightly different - I believe it is under Add/Remove >>> Programs on those versions. With this installed you can run IIS based >>> projects assuming your user account has sufficient privileges on the >>> machine. >>> >> >>That is not the case. Please read above :) >> >>> >>>>3. Some of the references are available on Nuget. Maybe we could add >>>>them to the projects with Nuget? I think it has been brought up on the >>>>list recently... >>> >>> This is only useful for the projects we build within VS2010, this >>>doesn't >>> help us with building for other platforms via NAnt because of the >>> interactions it would have with how we template those builds. There's >>>an >>> open issue to update the NuGet package specifications so we don't >>>include >>> the binary dependencies in the packages and rather rely on NuGet >>> dependency resolution for that and that makes sense because it's only >>> affecting users who consume our packages via NuGet. 
>>> >>> Thanks, >>> >>> Rob >>> >>>> >>>>Regards, >>>>Tom >>>> >>>>------------------------------------------------------------------------------ >>>>Live Security Virtual Conference >>>>Exclusive live event will cover all the ways today's security and >>>>threat landscape has changed and how IT managers can respond. >>>>Discussions will include endpoint security, mobile security and the latest in malware >>>>threats. http://www.accelacomm.com/jaw/sfrnl04242012/114/50122263/ >>>>_______________________________________________ >>>>dotNetRDF-develop mailing list >>>>dot...@li... >>>>https://lists.sourceforge.net/lists/listinfo/dotnetrdf-develop >>> >>> _______________________________________________ >>> dotNetRDF-develop mailing list >>> dot...@li... >>> https://lists.sourceforge.net/lists/listinfo/dotnetrdf-develop >> >>_______________________________________________ >>dotNetRDF-develop mailing list >>dot...@li... 
>>https://lists.sourceforge.net/lists/listinfo/dotnetrdf-develop > > _______________________________________________ > dotNetRDF-develop mailing list > dot...@li... > https://lists.sourceforge.net/lists/listinfo/dotnetrdf-develop |
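[Editor's note: the two-step SVN-to-Mercurial conversion Tom describes above (an svnsync mirror followed by the hg convert extension) can be sketched roughly as below. This is an illustrative sketch only, not the project's actual migration script; the SourceForge SVN URL and local paths are assumptions, and the real script lives in Tom's dnr-migration repository on Bitbucket.]

```sh
# Step 1: mirror the remote SVN repository locally with svnsync.
# svnsync needs a pre-revprop-change hook on the mirror that allows
# revision property changes.
svnadmin create svn-mirror
printf '#!/bin/sh\nexit 0\n' > svn-mirror/hooks/pre-revprop-change
chmod +x svn-mirror/hooks/pre-revprop-change
svnsync init "file://$PWD/svn-mirror" \
    https://dotnetrdf.svn.sourceforge.net/svnroot/dotnetrdf   # assumed URL
svnsync sync "file://$PWD/svn-mirror"

# Step 2: convert the local mirror with the hg convert extension
# (requires "convert =" under [extensions] in ~/.hgrc).
hg convert --source-type svn "file://$PWD/svn-mirror" dotnetrdf-hg

# Re-running both commands later picks up new SVN commits incrementally,
# as Tom notes, provided the hg history is not rewritten in between.
```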
From: Rob V. <rv...@do...> - 2012-08-28 22:36:56
|
Hi All With Tom's help and guidance I have pushed my initial efforts at converting the SVN repository to Mercurial up live on SourceForge. Please take a look at the repos at http://dotnetrdf.hg.sourceforge.net/hgweb/dotnetrdf/ and let me know what you think. Unless anyone has any major objections I'd like to switch over to using that immediately. I've been playing with Mercurial for a few days and it is really easy to get the hang of. There is an excellent tutorial up at http://hginit.com and there is also a detailed user manual at http://hgbook.red-bean.com Note that the binary repos were just done with a copy, paste and commit rather than a full conversion because I didn't really see much value in converting the past commit history of those binaries. That can still be obtained from SVN if anyone actually cares. If this looks good to people we'll get these cloned over to BitBucket and start using that as our day to day repositories; as Tom suggested previously we'll periodically push changes back to the SF repo so we keep a copy of the code up to date there as well. One of the benefits of this is that people who discover us only through SourceForge will always be able to find the code but we can utilize the user and collaboration friendly features over at BitBucket for our day to day work. Rob |
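[Editor's note: the day-to-day flow agreed in this thread (BitBucket for collaboration, with periodic pushes back to the SourceForge "stable" repo) reduces to a handful of Mercurial commands, sketched below. The repository URLs and the SSH path layout are illustrative assumptions, not the project's confirmed addresses.]

```sh
# One-off: clone the SourceForge repo and publish it to BitBucket
hg clone http://dotnetrdf.hg.sourceforge.net/hgweb/dotnetrdf/dotnetrdf
cd dotnetrdf
hg push https://bitbucket.org/dotnetrdf/dotnetrdf   # assumed BitBucket URL

# Day to day: pull and merge contributions on the BitBucket clone
hg pull https://bitbucket.org/dotnetrdf/dotnetrdf
hg update

# Periodically: push the accumulated changes back to SF so it stays
# the up-to-date central record (SSH path layout is an assumption)
hg push ssh://user@dotnetrdf.hg.sourceforge.net/hgroot/dotnetrdf/dotnetrdf
```

Because Mercurial is distributed, every clone in this flow carries the full history, so which repo is "master" is purely a matter of team convention, as discussed in the thread.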
From: Rob V. <rv...@do...> - 2012-08-28 22:30:27
|
Hey Tom Yet more comments... On 8/28/12 1:57 AM, "Tomasz Pluskiewicz" <tom...@gm...> wrote: >Hi Rob > >Comments inline again > >Tom > >On Mon, Aug 27, 2012 at 7:11 PM, Rob Vesse <rv...@do...> wrote: >> Hey Tom >> >> I have not had a chance to look properly yet but a few comments inline... >> >> On 8/25/12 7:47 AM, "Tomasz Pluskiewicz" <tom...@gm...> >> wrote: >> >>>Hi >>> >>>After much thought I have created a bitbucket team for dotNetRDF and >>>not one but multiple repositories. All of them imported as of revision >>>2374. Further changes will have to be applied manually but that >>>shouldn't be much of a problem. >> >> Do you have notes on how to perform the SVN to HG conversion? Or a >> pointer to a good tutorial? >> > >There are two steps there: svnsync to create a local copy of the >repository and the hg convert extension. I have just uploaded the >script to a repository: https://bitbucket.org/tpluscode/dnr-migration > >This is exactly the way I did it before. However it is not complete, >because I had to manually recreate/sync tags between repositories and >change the hierarchy so that the lib, core and utilities are >subrepositories of the build repository. I think this can work in an >incremental manner. Just rerun the batch file when svn commits appear >and they will get applied on top of the hg repositories. There should >be no problems until one starts to fiddle with the commit history >in hg after converting Thanks, you'll find a pull request from me on Bitbucket with updated scripts. Btw can you add me (rvesse) to the dotnetrdf team? After a bit of fiddling I got it to a state I liked, added a few extra tags and I pushed the conversion up to http://dotnetrdf.hg.sourceforge.net/hgweb/dotnetrdf/dotnetrdf Let me know what you think. > >> >>>I'm thinking maybe we could meet online >>>using join.me/VoIP? It would be the easiest and quickest way for us to >>>discuss my proposed version control changes and possibly establish >>>some common workflow. 
Who would be interested? Also what are your >>>timezones? I'm in Poland so it's UTC+1. >> >> California, UTC-8 >> >>> >>>In short there are four repositories on bitbucket >>>(https://bitbucket.org/dotnetrdf): >>> >>>- core >>>- utilities >>>- documentation >>>- build >> >> So these are the proposed masters right now? >> >> I am strongly against hosting the masters off SourceForge, if people >>want >> to clone the master repos from SourceForge off to a remote service like >> BitBucket that is fine by me but I intend to continue using SF as the >>VCS >> hosting provider. >> > >Out of curiosity could you elaborate on why you are reluctant to leave >SF as the code host? Primarily this is just an issue of continuity, maintaining SF as the central record for source code >At this moment I don't see any advantages of SF. >Still it has some shortcomings and missing features: > >- it seems it isn't possible to host multiple repositories for one >project - minor but a limitation nonetheless Per your later comment it does allow this >- bitbucket integrates nicely with pull requests, meaning that when >someone outside the team wants to contribute they just need one click >on bitbucket to notify you >- on bitbucket it is possible to view differences between forks online I have to agree these are nice features and since this is DVCS we can have a compromise. We maintain the SF repo as the "stable" repo which only I push to and we do everything else through repos hosted on bitbucket for the user friendliness. Periodically I will push the changes from our "master" on bitbucket over to SF Which is fairly similar to what you suggested in the first place, I guess you talked me round in the end :-) (A few days of playing with mercurial helped as well) > >In any case I'm fine with sticking to SF, but in that case the >migration script will need rethinking. 
Per my earlier comment, already done, thanks for getting this started > >> >> Is there a way to modify those repositories so they become clones of the >> masters at SF as and when we get those up and running? >> > >You see this is the beauty of distributed code. You don't have to do >anything. If by master you mean the most up-to-date repository, then >it is only a matter of agreement within a team. You can just clone >your repository to a new destination and start using it as your new >master. Because they are related you can always go back to the old one >and all it takes, in terms of an open source project, is changing the URL >advertised on the website, so that devs can update it in their VCS >clients. Having played with this I understand it now, as I said above I can clone the SF repo to bitbucket, then make my own clone on bitbucket, pushing changes to that and accepting pull requests for the user friendliness and then push the changes to the bitbucket "master" and the SF repo as necessary > >> >>> >>>Core is most of trunk except build and utilities, which got their own >>>repositories. >>>Documentation is separate because it was in its own folder under >>>SVN's root and there was no way I know of to merge it with core while >>>keeping history. >>>Build is a bit different. It contains the others as subrepositories >>>and when pulled it pulls the others along and when pushed it pushes the >>>others too. As such I think that only Rob should be pushing to the >>>Build repository. >> >> I disagree, any committer should be able to push to any of our >> repositories. The beauty of version control is that if someone does >> something we disagree on we can always roll it back. >> > >I agree totally. However giving anyone direct access to all >repositories could encourage a mess. Again, with version control like >CVS or SVN that is the only way. With DVCS you can limit write access >to your masters and still allow people to work on the code and commit >to their own clones. 
And when they are ready to contribute they ask you to >pull their changes. With a project where there is one lead developer >this may not be so obvious but you surely must have experienced this >with SVN: someone commits broken code and it is immediately >distributed among other team members. DVCS gives you the power to >avoid that. > >> >>>It actually should keep little more than build >>>scripts and (nightly) builds. It should also be used to sync tags >>>created in the other repositories. I have done that by linking the >>>051, 060, 061, 062, 070 and 071 tags. Updating to any one of them in Build >>>updates the other repositories to the same tag. Any older tags are >>>lost because before around 2010/08 the repository had a different >>>layout and the SVN to Hg conversion can't guess that transition. Other >>>than that I don't think it is vital. The commits are still in SVN for >>>reference anyway. >> >> I have to admit to being slightly confused about why you would bother >> splitting into separate repositories at all given what you've just >> described. For example the utilities need the core to build so a >> developer would have to check both out regardless so why not have the >> "build" repository just be a trunk repository with trunk pretty much as >>it >> exists now. >> > >It could be this way. My layout is just a proposal and it may indeed >be overly complicated for now. > >> >> However splitting out the documentation makes sense, I would also >>suggest >> splitting out the checked-in binaries into two separate repositories - >> binaries-nightly and binaries-stable - >> http://dotnetrdf.hg.sourceforge.net/hgweb/dotnetrdf/ >> > >Oh I have just clicked the link above. So it is possible to have >multiple repositories on SF after all. Then some of what I have >already written is no longer valid :) > >Your proposed layout is simpler, and simpler often means better. 
I will >update the migration script so that it produces exactly that for >comparison > >> >>> >>>What have I done: >>> >>>1. Updated paths in projects/files so that all builds fine >>>2. Added a stripped version of dotNetRDF.sln for people who would >>>clone just the core project and renamed the other dotNetRDF.All.sln >> >> Regardless of the final repo structure having separate solutions for >> people who want partial builds makes sense to me +1 Already added a dotNetRDF.Core.sln to the SF repo >> >>> >>>Things I didn't finish: >>> >>>1. GraphBenchmarker project. The gzipped datasets now cause missing ttl >>>files for me. >> >> I'll fix this in SVN Fixed in SF repo instead >> >>>2. dotNetRDF.snk file is missing >> >> Think that got added after the revision you converted Was present in the revision I converted so is in the SF repo >> >>>3. The web projects don't load for me at the moment (see question 2. >>>below) >>>4. I cannot run unit tests (I am currently using VS Express + >>>Lightswitch and I'm unable to open the unit tests project) >> >> Likely a limitation of those versions of VS >> > >Yea. I should buy VS Pro soon. I have been lucky enough not to have to do that for some time thanks to MSDNAA and university site licenses. > >> >>>5. File bad-example.ttl is missing in rdfWebDeploy project >> >> It isn't needed, I'll remove it from the project information. Removed from SF repo >> >>>6. I haven't touched the Build folder because I don't want to break >>>something. I think updating paths should be enough. >>> >>>I have some questions/suggestions: >>> >>>1. I think that the *.suo and *.user files shouldn't be committed. >>>Is there a reason why they are? >> >> Not really Removed from SF repo >> >>>2. For portability I suggest using IIS Express for the WebDemos, BSBM >>>and sp2b projects. Won't require having IIS installed. >> >> So VS Express actually won't load projects that use IIS? 
Does it give >>you >> some sort of error when you try and run the project? >> > >It says "The Web Application Project WebDemos is configured to use >IIS. The Web server 'http://localhost/demos' could not be found" > >> >> I'd much prefer to stick with IIS, and I assume here by IIS Express you mean >> the VS Development server? No-one is going to deploy in production with >> the development server and I would much rather build and test on a real >> IIS environment. >> > >No I don't mean the VS Development server. I mean IIS Express >(http://learn.iis.net/page.aspx/868/iis-express-overview/), which is a >simpler version of IIS 7.5 but fully compatible with it, unlike the >built-in server. It is installed online or with the Web Platform >installer. It can even be installed directly from VS 2010 (I think SP1 >is required) and VS Web Developer Express. All you need to do is >right-click on a web project and choose "Use IIS Express". Should you >open the solution on a new system, Visual Studio also tells you that IIS >Express is required and lets you install it. No administrator rights >required, unlike with full IIS. Hmmm, I don't have this option on a web project but from reading around it sounds like you need IIS Express installed first I will experiment and update the projects in the SF repo when I have resolved this Rob > >> >> Btw anyone can install IIS Express on most Windows installations by >>going >> to Control Panel > Programs > Turn Windows features on or off and then >> select the IIS components they need from the list (assuming Windows 7). >> You can do this on Windows XP/Windows Vista as well; the control panel >> naming is just slightly different - I believe it is under Add/Remove >> Programs on those versions. With this installed you can run IIS based >> projects assuming your user account has sufficient privileges on the >> machine. >> > >That is not the case. Please read above :) > >> >>>3. Some of the references are available on Nuget. 
Maybe we could add >>>them to the projects with Nuget? I think it has been brought up on the >>>list recently... >> >> This is only useful for the projects we build within VS2010, this >>doesn't >> help us with building for other platforms via NAnt because of the >> interactions it would have with how we template those builds. There's >>an >> open issue to update the NuGet package specifications so we don't >>include >> the binary dependencies in the packages and rather rely on NuGet >> dependency resolution for that and that makes sense because it's only >> affecting users who consume our packages via NuGet. >> >> Thanks, >> >> Rob >> >>> >>>Regards, >>>Tom >>> >>>_______________________________________________ >>>dotNetRDF-develop mailing list >>>dot...@li... >>>https://lists.sourceforge.net/lists/listinfo/dotnetrdf-develop >> >> _______________________________________________ >> dotNetRDF-develop mailing list >> dot...@li... 
>> https://lists.sourceforge.net/lists/listinfo/dotnetrdf-develop > >_______________________________________________ >dotNetRDF-develop mailing list >dot...@li... >https://lists.sourceforge.net/lists/listinfo/dotnetrdf-develop |
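[Editor's note: the subrepository layout Tom experimented with above (lib/core/utilities pulled and pushed along with the build repository) is driven in Mercurial by an `.hgsub` file checked into the parent repository. The paths and URLs below are hypothetical, matching the repository names mentioned in the thread:]

```
core = https://bitbucket.org/dotnetrdf/core
utilities = https://bitbucket.org/dotnetrdf/utilities
documentation = https://bitbucket.org/dotnetrdf/documentation
```

With this file committed in the build repository, cloning it also clones the subrepositories, and `hg push` from the parent pushes them too, which matches the behaviour described. Mercurial pins each subrepo to a specific revision via an automatically maintained `.hgsubstate` file, which is what allows a tag in the build repository to update all the others to matching tags.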
From: Tomasz P. <tom...@gm...> - 2012-08-28 08:58:42
|
Hi Rob Comments inline again Tom On Mon, Aug 27, 2012 at 7:11 PM, Rob Vesse <rv...@do...> wrote: > Hey Tom > > I have not had a chance to look properly yet but a few comments inline... > > On 8/25/12 7:47 AM, "Tomasz Pluskiewicz" <tom...@gm...> > wrote: > >>Hi >> >>After much thought I have created a bitbucket team for dotNetRDF and >>not one but multiple repositories. All of them imported as of revision >>2374. Further changes will have to be applied manually but that >>shouldn't be much of a problem. > > Do you have notes on how to perform the SVN to HG conversion? Or a > pointer to a good tutorial? > There are two steps there: svnsync to create a local copy of the repository and the hg convert extension. I have just uploaded the script to a repository: https://bitbucket.org/tpluscode/dnr-migration This is exactly the way I did it before. However it is not complete, because I had to manually recreate/sync tags between repositories and change the hierarchy so that the lib, core and utilities are subrepositories of the build repository. I think this can work in an incremental manner. Just rerun the batch file when svn commits appear and they will get applied on top of the hg repositories. There should be no problems until one starts to fiddle with the commit history in hg after converting > >>I'm thinking maybe we could meet online >>using join.me/VoIP? It would be the easiest and quickest way for us to >>discuss my proposed version control changes and possibly establish >>some common workflow. Who would be interested? Also what are your >>timezones? I'm in Poland so it's UTC+1. > > California, UTC-8 > >> >>In short there are four repositories on bitbucket >>(https://bitbucket.org/dotnetrdf): >> >>- core >>- utilities >>- documentation >>- build > > So these are the proposed masters right now? 
> > I am strongly against hosting the masters off SourceForge, if people want > to clone the master repos from SourceForge off to a remote service like > BitBucket that is fine by me but I intend to continue using SF as the VCS > hosting provider. > Out of curiosity could you elaborate on why you are reluctant to leave SF as the code host? At this moment I don't see any advantages of SF. Still it has some shortcomings and missing features: - it seems it isn't possible to host multiple repositories for one project - minor but a limitation nonetheless - bitbucket integrates nicely with pull requests, meaning that when someone outside the team wants to contribute they just need one click on bitbucket to notify you - on bitbucket it is possible to view differences between forks online In any case I'm fine with sticking to SF, but in that case the migration script will need rethinking. > > Is there a way to modify those repositories so they become clones of the > masters at SF as and when we get those up and running? > You see this is the beauty of distributed code. You don't have to do anything. If by master you mean the most up-to-date repository, then it is only a matter of agreement within a team. You can just clone your repository to a new destination and start using it as your new master. Because they are related you can always go back to the old one and all it takes, in terms of an open source project, is changing the URL advertised on the website, so that devs can update it in their VCS clients. > >> >>Core is most of trunk except build and utilities, which got their own >>repositories. >>Documentation is separate because it was in its own folder under >>SVN's root and there was no way I know of to merge it with core while >>keeping history. >>Build is a bit different. It contains the others as subrepositories >>and when pulled it pulls the others along and when pushed it pushes the >>others too. As such I think that only Rob should be pushing to the >>Build repository. 
> > I disagree, any committer should be able to push to any of our > repositories. The beauty of version control is that if someone does > something we disagree on we can always roll it back. > I agree totally. However giving anyone direct access to all repositories could encourage a mess. Again, with version control like CVS or SVN that is the only way. With DVCS you can limit write access to your masters and still allow people to work on the code and commit to their own clones. And when they are ready to contribute they ask you to pull their changes. With a project where there is one lead developer this may not be so obvious but you surely must have experienced this with SVN: someone commits broken code and it is immediately distributed among other team members. DVCS gives you the power to avoid that. > >>It actually should keep little more than build >>scripts and (nightly) builds. It should also be used to sync tags >>created in the other repositories. I have done that by linking the >>051, 060, 061, 062, 070 and 071 tags. Updating to any one of them in Build >>updates the other repositories to the same tag. Any older tags are >>lost because before around 2010/08 the repository had a different >>layout and the SVN to Hg conversion can't guess that transition. Other >>than that I don't think it is vital. The commits are still in SVN for >>reference anyway. > > I have to admit to being slightly confused about why you would bother > splitting into separate repositories at all given what you've just > described. For example the utilities need the core to build so a > developer would have to check both out regardless so why not have the > "build" repository just be a trunk repository with trunk pretty much as it > exists now. > It could be this way. My layout is just a proposal and it may indeed be overly complicated for now. 
> > However splitting out the documentation makes sense, I would also suggest > splitting out the checked-in binaries into two separate repositories - > binaries-nightly and binaries-stable - > http://dotnetrdf.hg.sourceforge.net/hgweb/dotnetrdf/ > Oh, I have just clicked the link above. So it is possible to have multiple repositories on SF after all. Then some of what I have already written is no longer valid :) Your proposed layout is simpler, and simpler often means better. I will update the migration script so that it produces exactly that for comparison. > >> >>What have I done: >> >>1. Updated paths in projects/files so that everything builds fine >>2. Added a stripped version of dotNetRDF.sln for people who would >>clone just the core project and renamed the other dotNetRDF.All.sln > > Regardless of the final repo structure having separate solutions for > people who want partial builds makes sense to me +1 > >> >>Things I didn't finish: >> >>1. GraphBenchmarker project. The gzipped datasets now cause missing ttl >>files for me. > > I'll fix this in SVN > >>2. dotNetRDF.snk file is missing > > Think that got added after the revision you converted > >>3. The web projects don't load for me at the moment (see question 2. >>below) >>4. I cannot run unit tests (I am currently using VS Express + >>Lightswitch and I'm unable to open the unit tests project) > > Likely a limitation of those versions of VS > Yeah. I should buy VS Pro soon. > >>5. File bad-example.ttl is missing in rdfWebDeploy project > > It isn't needed, I'll remove it from the project information. > >>6. I haven't touched the Build folder because I don't want to break >>something. I think updating paths should be enough. >> >>I have some questions/suggestions: >> >>1. I think that the *.suo and *.user files shouldn't be committed. >>Is there a reason why they are? > > Not really > >>2. For portability I suggest using IIS Express for the WebDemos, BSBM >>and sp2b projects. Won't require having IIS installed. 
> > So VS Express actually won't load projects that use IIS? Does it give you > some sort of error when you try and run the project? > It says "The Web Application Project WebDemos is configured to use IIS. The Web server 'http://localhost/demos' could not be found" > > I'd much prefer to stick with IIS, and I assume here by IIS Express you mean > the VS Development server? No-one is going to deploy in production with > the development server and I would much rather build and test on a real > IIS environment. > No, I don't mean the VS Development server. I mean IIS Express (http://learn.iis.net/page.aspx/868/iis-express-overview/), which is a simpler version of IIS 7.5 but fully compatible with it, unlike the built-in server. It is installed on its own or with the Web Platform Installer. It can even be installed directly from VS 2010 (I think SP1 is required) and VS Web Developer Express. All you need to do is right-click on a web project and choose "Use IIS Express". Should you open the solution on a new system, Visual Studio also tells you that IIS Express is required and lets you install it. No administrator rights are required, unlike with full IIS. > > Btw anyone can install IIS Express on most Windows installations by going > to Control Panel > Programs > Turn Windows features on or off and then > select the IIS components they need from the list (assuming Windows 7). > You can do this on Windows XP/Windows Vista as well, the control panel > naming is just slightly different - I believe it is under Add/Remove > Programs on those versions. With this installed you can run IIS based > projects assuming your user account has sufficient privileges on the > machine. > That is not the case. Please read above :) > >>3. Some of the references are available on Nuget. Maybe we could add >>them to the projects with Nuget? I think it has been brought up on the >>list recently... 
> > This is only useful for the projects we build within VS2010, this doesn't > help us with building for other platforms via NAnt because of the > interactions it would have with how we template those builds. There's an > open issue to update the NuGet package specifications so we don't include > the binary dependencies in the packages and rather rely on NuGet > dependency resolution for that and that makes sense because it's only > affecting users who consume our packages via NuGet. > > Thanks, > > Rob > >> >>Regards, >>Tom |
From: Khalil A. <ka...@ne...> - 2012-08-28 08:42:05
|
Sorry for the late reply - had a welcome long weekend off from email :) It would be great to host some instances that can be used for testing. I guess that the problem will be in ensuring that tests can be run both concurrently and consecutively on any hosted service. For now I'll probably work on setting up a local dev machine / VM with at least Fuseki set up on it. Cheers Kal On Fri, Aug 24, 2012 at 6:55 PM, Rob Vesse <rv...@do...> wrote: > Sorry I gave a bad path for the Fuseki download, correct path is > https://dist.apache.org/repos/dist/release/jena/binaries/ > > From: Cray Employee <rv...@do...> > > Reply-To: dotNetRDF Developer Discussion and Feature Request < > dot...@li...> > Date: Friday, August 24, 2012 9:58 AM > > To: dotNetRDF Developer Discussion and Feature Request < > dot...@li...> > Cc: Ron Michael Zettlemoyer <ron...@fy...> > Subject: Re: [dotNetRDF-Develop] Regression testing before commit > > Hey Kal > > To be honest most of the time changes to the core APIs shouldn't affect > the Storage APIs though ideally you would run with at least some subset of > them just in case. > > Fuseki is super easy to download and run requiring no install, grab the > fuseki distro from jena.apache.org/dist/, unzip to a folder then just run > with the following: > > java -jar fuseki-server.jar --mem --update > > Then just make sure you have Fuseki enabled in the > UnitTestConfig.properties and it should run those tests though sometimes > you may need to do a Rebuild to get it to pick up changes to the file (a VS > oddity). There are others that are relatively simple but all require some > install of some kind. > > One thing I have been thinking of doing but haven't got round to yet is > setting up some CI servers where we have copies of all the supported stores > available and running. 
Then we could actually check in a > UnitTestConfig.properties rather than just having a template for it so any > developer would automatically be able to run the full tests against the set > of remotely hosted stores without requiring local installs. If something > does need debugging the developer can always alter their local copy of > UnitTestConfig.properties to point to a local install of a store to make > that easier for them. > > Infrastructure-wise I'm not sure what the best way of doing this is, I > would guess using one of the cloud services like EC2 or Azure would be > easiest but I'm not sure how cost effective they are. I know Ron uses EC2 > for some stuff his company does so he may have some useful input on this. > Most likely we would need at least two servers because some stores are > cross platform and easier to set up on Windows while others are Linux only. > At a push we may be able to get away with a single Linux box but having a > Windows box as well would be useful because we could also use that to host > a CI system like Jenkins and have it run automated builds and tests as > people check in code. > > What do you think? > > Rob > > From: Kal Ahmed <ka...@ne...> > Reply-To: dotNetRDF Developer Discussion and Feature Request < > dot...@li...> > Date: Tuesday, August 21, 2012 11:30 PM > To: dotNetRDF Developer Discussion and Feature Request < > dot...@li...> > Subject: [dotNetRDF-Develop] Regression testing before commit > > Hi Rob, > > OK, so the separate question is about the extent of regression testing for > commits. The current unit testing framework for the project allows for the > third party stores to be optionally present. My question is: do you > have a preference for what level of regression testing is done on changes > to core ? 
Currently I don't have any of the third-party stores installed on > any of my dev machines / VMs, so although I can run quite a significant > proportion of the unit tests, there are quite a number that get skipped. Is > that a big problem ? Should I install at least one (or more) of these > stores and ensure that the unit tests for that store run correctly before > committing ? Ideally I would like to avoid having to install them all...but > am I just being lazy ? > > Cheers > > KAl > > -- > Kal Ahmed > Director, Networked Planet Limited > e: kal...@ne... > w: www.networkedplanet.com > t: +44 1865 811131 > > -- Kal Ahmed Director, Networked Planet Limited e: kal...@ne... w: www.networkedplanet.com t: +44 1865 811131 |
From: Tomasz P. <tom...@gm...> - 2012-08-28 07:30:59
|
Hi I have been thinking about TeamCity myself lately and there is even freely available CI hosting for OS projects. It is located at http://teamcity.codebetter.com/ and already hosts a fair amount of projects. However I doubt it will be possible to run tests that depend on third party stores. If you were to set up that Amazon hosting or similar, do consider the TeamCity open source licence. JetBrains gives a one-year licence for free: http://www.jetbrains.com/teamcity/buy/index.jsp. Other than that https://continuous.io/ looks promising :) Regards, Tom On Mon, Aug 27, 2012 at 4:05 PM, Ron Michael Zettlemoyer <ron...@fy...> wrote: > I haven't done anything with CI but I do use EC2 a lot and I'm pretty happy > with its cost and flexibility as compared to competitors like Azure, > Rackspace and Google. I have a little cheat sheet to break out some of the > costs: > > https://docs.google.com/a/fynydd.com/spreadsheet/ccc?key=0Aka6vCGW26qqdFR3TnNoakhQT1ctdFdDQ2JTakxHaWc > > If you get some micro instances (their smallest and cheapest) and only start > them when a build is needed, it should be pretty cheap albeit not very fast. > Even running 24x7, a micro instance can cost as little as $100/year plus > storage & bandwidth. > > There are managed CI options like https://continuous.io/ and > https://www.shiningpanda-ci.com/ although I'm not sure either of them will > be flexible enough, and the first one might even be dead. > > > > On Fri, Aug 24, 2012 at 12:58 PM, Rob Vesse <rv...@do...> wrote: >> >> Hey Kal >> >> To be honest most of the time changes to the core APIs shouldn't affect >> the Storage APIs though ideally you would run with at least some subset of >> them just in case. 
>> >> Fuseki is super easy to download and run requiring no install, grab the >> fuseki distro from jena.apache.org/dist/, unzip to a folder then just run >> with the following: >> >> java -jar fuseki-server.jar --mem --update >> >> Then just make sure you have Fuseki enabled in the >> UnitTestConfig.properties and it should run those tests though sometimes you >> may need to do a Rebuild to get it to pick up changes to the file (a VS >> oddity). There are others that are relatively simple but all require some >> install of some kind. >> >> One thing I have been thinking of doing but haven't got round to yet is >> setting up some CI servers where we have copies of all the supported stores >> available and running. Then we could actually check in a >> UnitTestConfig.properties rather than just having a template for it so any >> developer would automatically be able to run the full tests against the set >> of remotely hosted stores without requiring local installs. If something >> does need debugging the developer can always alter their local copy of >> UnitTestConfig.properties to point to a local install of a store to make >> that easier for them. >> >> Infrastructure-wise I'm not sure what the best way of doing this is, I >> would guess using one of the cloud services like EC2 or Azure would be >> easiest but I'm not sure how cost effective they are. I know Ron uses EC2 >> for some stuff his company does so he may have some useful input on this. >> Most likely we would need at least two servers because some stores are cross >> platform and easier to set up on Windows while others are Linux only. At a >> push we may be able to get away with a single Linux box but having a Windows >> box as well would be useful because we could also use that to host a CI >> system like Jenkins and have it run automated builds and tests as people >> check in code. >> >> What do you think? 
>> >> Rob >> >> From: Kal Ahmed <ka...@ne...> >> Reply-To: dotNetRDF Developer Discussion and Feature Request >> <dot...@li...> >> Date: Tuesday, August 21, 2012 11:30 PM >> To: dotNetRDF Developer Discussion and Feature Request >> <dot...@li...> >> Subject: [dotNetRDF-Develop] Regression testing before commit >> >> Hi Rob, >> >> OK, so the separate question is about the extent of regression testing for >> commits. The current unit testing framework for the project allows for the >> third party stores to be optionally present. My question is that do you have >> a preference for what level of regression testing is done on changes to core >> ? Currently I don't have any of the third-party stores installed on any of >> my dev machines / VMs, so although I can run quite a significant proportion >> of the unit tests, there are quite a number that get skipped. Is that a big >> problem ? Should I install at least one (or more) of these stores and ensure >> that the unit tests for that store run correctly before committing ? Ideally >> I would like to avoid having to install them all...but am I just being lazy >> ? >> >> Cheers >> >> KAl >> >> -- >> Kal Ahmed >> Director, Networked Planet Limited >> e: kal...@ne... >> w: www.networkedplanet.com >> t: +44 1865 811131 > |
From: <tr...@do...> - 2012-08-27 19:51:03
|
<p>The following issue has been added to a project that you are monitoring.</p> <table border="0"> <tr> <td width="90px" valign="top"><b>Title:</b></td> <td>Set up CI servers</td> </tr> <tr> <td><b>Project:</b></td> <td>Core Library (dotNetRDF.dll)</td> </tr> <tr> <td><b>Created By:</b></td> <td>Rob Vesse</td> </tr> <tr> <td><b>Milestone:</b></td> <td>0.8.0 RC 1</td> </tr> <tr> <td><b>Category:</b></td> <td>Testing</td> </tr> <tr> <td><b>Priority:</b></td> <td>High</td> </tr> <tr> <td><b>Type:</b></td> <td>Improvement</td> </tr> <tr> <td><b>Description:</b></td> </tr> <tr> <td colspan="2"><p> In order to ensure the stability of the code base and keep track of any regressions that may occur we want to set up CI servers. These servers serve two purposes:</p> <ol> <li> To run the unit tests on a more regular, developer-independent basis</li> <li> To provide a location to host installs of the various 3rd party stores we support so developers don't need to have those available on their system</li> </ol> <p> The proposed approach to this is to set up a couple of virtual machines on EC2, we will likely only run them a few times a week to keep the costs down. The aim would be to have these fire up, check out the latest code, run the builds and tests and then deploy the successful builds to the relevant repositories (SVN and/or Mercurial)</p></td> </tr> </table> <p> More information on this issue can be found at <a href="http://www.dotnetrdf.org/tracker/Issues/IssueDetail.aspx?id=276" target="_blank">http://www.dotnetrdf.org/tracker/Issues/IssueDetail.aspx?id=276</a></p> <p style="text-align:center;font-size:8pt;padding:5px;"> If you no longer wish to receive notifications, please visit <a href="http://www.dotnetrdf.org/tracker/Account/UserProfile.aspx" target="_blank">your profile</a> and change your notifications options. </p> |
From: Rob V. <rv...@do...> - 2012-08-27 17:12:03
|
Hey Tom I have not had a chance to look at it properly yet but a few comments inline... On 8/25/12 7:47 AM, "Tomasz Pluskiewicz" <tom...@gm...> wrote: >Hi > >After much thought I have created a bitbucket team for dotNetRDF and >not one but multiple repositories. All of them imported as of revision >2374. Further changes will have to be applied manually but that >shouldn't be much of a problem. Do you have notes on how to perform the SVN to HG conversion? Or a pointer to a good tutorial? >I'm thinking maybe we could meet online >using join.me/VoIP? It would be the easiest and quickest way for us to >discuss my proposed version control changes and possibly establish >some common workflow. Who would be interested? Also what are your >timezones? I'm in Poland so it's UTC+1. California, UTC-8 > >In short there are four repositories on bitbucket >(https://bitbucket.org/dotnetrdf): > >- core >- utilities >- documentation >- build So these are the proposed masters right now? I am strongly against hosting the masters off SourceForge, if people want to clone the master repos from SourceForge off to a remote service like BitBucket that is fine by me but I intend to continue using SF as the VCS hosting provider. Is there a way to modify those repositories so they become clones of the masters at SF as and when we get those up and running? > >Core is most of trunk except build and utilities, which got their own >repositories. >Documentation is separate because it was in its own folder under >SVN's root and there was no way I know of to merge it with core while >keeping history. >Build is a bit different. It contains the others as subrepositories >and when pulled it pulls the others along and when pushed it pushes the >others too. As such I think that only Rob should be pushing to the >Build repository. I disagree, any committer should be able to push to any of our repositories. The beauty of version control is that if someone does something we disagree on we can always roll it back. 
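As a concrete sketch of that roll-back workflow in Mercurial (the revision number below is hypothetical, just to show the shape of the commands):

```
# Commit the inverse of a bad changeset, merge it with the current head,
# then publish the fix (rev 1234 is a made-up example revision)
hg backout -r 1234 --merge
hg commit -m "Backed out changeset 1234"
hg push
```

`hg backout` leaves history intact, so nothing a committer pushes is ever unrecoverable.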
>It actually should keep little more than build >scripts and (nightly) builds. It should also be used to sync tags >created in the other repositories. I have done that by linking the >051, 060, 061, 062, 070 and 071. Updating to any one of them in Build >updates the other repositories to the same tag. Any older tags are >lost because before around 2010/08 the repository had a different >layout and the SVN to Hg conversion can't guess that transition. Other >than that I don't think it is vital. The commits are still in SVN for >reference anyway. I have to admit to being slightly confused about why you would bother splitting into separate repositories at all given what you've just described. For example the utilities need the core to build, so a developer would have to check both out regardless, so why not have the "build" repository just be a trunk repository with trunk pretty much as it exists now. However splitting out the documentation makes sense, I would also suggest splitting out the checked-in binaries into two separate repositories - binaries-nightly and binaries-stable - http://dotnetrdf.hg.sourceforge.net/hgweb/dotnetrdf/ > >What have I done: > >1. Update paths in projects/files so that all builds fine >2. Added a stripped version of dotNetRDF.sln for people who would >clone just the core project and renamed the other dotNetRDF.All.sln Regardless of the final repo structure having separate solutions for people who want partial builds makes sense to me +1 > >Things I didn't finish: > >1. GraphBenchmarker project. The gzipped datasets now cause missing ttl >files for me. I'll fix this in SVN >2. dotNetRDF.snk file is missing Think that got added after the revision you converted >3. The web projects don't load for me at the moment (see question 2. >below) >4. I cannot run unit tests (I am currently using VS Express + >Lightswitch and I'm unable to open the unit tests project) Likely a limitation of those versions of VS >5. 
File bad-example.ttl is missing in rdfWebDeploy project It isn't needed, I'll remove it from the project information. >6. I haven't touched the Build folder because I don't want to break >something. I think updating paths should be enough. > >I have some questions/suggestions: > >1. I think that the *.suo and *.user files shouldn't be committed. >Is there a reason why they are? Not really >2. For portability I suggest using IIS Express for the WebDemos, BSBM >and sp2b projects. Won't require having IIS installed. So VS Express actually won't load projects that use IIS? Does it give you some sort of error when you try and run the project? I'd much prefer to stick with IIS, and I assume here by IIS Express you mean the VS Development server? No-one is going to deploy in production with the development server and I would much rather build and test on a real IIS environment. Btw anyone can install IIS Express on most Windows installations by going to Control Panel > Programs > Turn Windows features on or off and then select the IIS components they need from the list (assuming Windows 7). You can do this on Windows XP/Windows Vista as well, the control panel naming is just slightly different - I believe it is under Add/Remove Programs on those versions. With this installed you can run IIS based projects assuming your user account has sufficient privileges on the machine. >3. Some of the references are available on Nuget. Maybe we could add >them to the projects with Nuget? I think it has been brought up on the >list recently... This is only useful for the projects we build within VS2010, this doesn't help us with building for other platforms via NAnt because of the interactions it would have with how we template those builds. 
There's an open issue to update the NuGet package specifications so we don't include the binary dependencies in the packages and rather rely on NuGet dependency resolution for that and that makes sense because it's only affecting users who consume our packages via NuGet. Thanks, Rob > >Regards, >Tom |
From: Ron M. Z. <ron...@fy...> - 2012-08-27 14:13:22
|
I haven't done anything with CI but I do use EC2 a lot and I'm pretty happy with its cost and flexibility as compared to competitors like Azure, Rackspace and Google. I have a little cheat sheet to break out some of the costs: https://docs.google.com/a/fynydd.com/spreadsheet/ccc?key=0Aka6vCGW26qqdFR3TnNoakhQT1ctdFdDQ2JTakxHaWc If you get some micro instances (their smallest and cheapest) and only start them when a build is needed, it should be pretty cheap albeit not very fast. Even running 24x7, a micro instance can cost as little as $100/year plus storage & bandwidth. There are managed CI options like https://continuous.io/ and https://www.shiningpanda-ci.com/ although I'm not sure either of them will be flexible enough, and the first one might even be dead. On Fri, Aug 24, 2012 at 12:58 PM, Rob Vesse <rv...@do...> wrote: > Hey Kal > > To be honest most of the time changes to the core APIs shouldn't affect > the Storage APIs though ideally you would run with at least some subset of > them just in case. > > Fuseki is super easy to download and run requiring no install, grab the > fuseki distro from jena.apache.org/dist/, unzip to a folder then just run > with the following: > > java -jar fuseki-server.jar --mem --update > > Then just make sure you have Fuseki enabled in the > UnitTestConfig.properties and it should run those tests though sometimes > you may need to do a Rebuild to get it to pick up changes to the file (a VS > oddity). There are others that are relatively simple but all require some > install of some kind. > > One thing I have been thinking of doing but haven't got round to yet is > setting up some CI servers where we have copies of all the supported stores > available and running. 
If something > does need debugging the developer can always alter their local copy of > UnitTestConfig.properties to point to a local install of a store to make > that easier for them. > > Infrastructure wise I'm not sure what the best way of doing this is, I > would guess using one of the cloud services like EC2 or Azure would be > easiest but I'm not sure how cost effective they are. I know Ron uses EC2 > for some stuff his company does so he may have some useful input on this. > Most likely we would need at least two servers because some stores are > cross platform and easier to set up on Windows while others are Linux only. > At a push we may be able to get away with a single Linux box but having a > Windows box as well would be useful because we could also use that to host > a CI system like Jenkins and have it run automated builds and tests as > people check in code. > > What do you think? > > Rob > > From: Kal Ahmed <ka...@ne...> > Reply-To: dotNetRDF Developer Discussion and Feature Request < > dot...@li...> > Date: Tuesday, August 21, 2012 11:30 PM > To: dotNetRDF Developer Discussion and Feature Request < > dot...@li...> > Subject: [dotNetRDF-Develop] Regression testing before commit > > Hi Rob, > > OK, so the separate question is about the extent of regression testing for > commits. The current unit testing framework for the project allows for the > third party stores to be optionally present. My question is that do you > have a preference for what level of regression testing is done on changes > to core ? Currently I don't have any of the third-party stores installed on > any of my dev machines / VMs, so although I can run quite a significant > proportion of the unit tests, there are quite a number that get skipped. Is > that a big problem ? Should I install at least one (or more) of these > stores and ensure that the unit tests for that store run correctly before > committing ? 
Ideally I would like to avoid having to install them all...but > am I just being lazy ? > > Cheers > > KAl > > -- > Kal Ahmed > Director, Networked Planet Limited > e: kal...@ne... > w: www.networkedplanet.com > t: +44 1865 811131 > > |
From: Tomasz P. <tom...@gm...> - 2012-08-25 20:09:15
|
FYI, I have uploaded all of it to the repositories so do have a look. Tom On Sat, Aug 25, 2012 at 4:51 PM, Tomasz Pluskiewicz <tom...@gm...> wrote: > Oh, just one more thing. The Build repository is quite big so I'm > pushing it in parts. Will push the rest later today because now I'm > going out :) > > Tom > > On Sat, Aug 25, 2012 at 4:47 PM, Tomasz Pluskiewicz > <tom...@gm...> wrote: >> Hi >> >> After much thought I have created a bitbucket team for dotNetRDF and >> not one but multiple repositories. All of them imported as of revision >> 2374. Further changes will have to be applied manually but that >> shouldn't be much of a problem. I'm thinking maybe we could meet online >> using join.me/VoIP? It would be the easiest and quickest way for us to >> discuss my proposed version control changes and possibly establish >> some common workflow. Who would be interested? Also what are your >> timezones? I'm in Poland so it's UTC+1. >> >> In short there are four repositories on bitbucket >> (https://bitbucket.org/dotnetrdf): >> >> - core >> - utilities >> - documentation >> - build >> >> Core is most of trunk except build and utilities, which got their own >> repositories. >> Documentation is separate because it was in its own folder under >> SVN's root and there was no way I know of to merge it with core while >> keeping history. >> Build is a bit different. It contains the others as subrepositories >> and when pulled it pulls the others along and when pushed it pushes the >> others too. As such I think that only Rob should be pushing to the >> Build repository. It actually should keep little more than build >> scripts and (nightly) builds. It should also be used to sync tags >> created in the other repositories. I have done that by linking the >> 051, 060, 061, 062, 070 and 071. Updating to any one of them in Build >> updates the other repositories to the same tag. 
Any older tags are >> lost because before around 2010/08 the repository had a different >> layout and the SVN to Hg conversion can't guess that transition. Other >> than that I don't think it is vital. The commits are still in SVN for >> reference anyway. >> >> What have I done: >> >> 1. Updated paths in projects/files so that everything builds fine >> 2. Added a stripped version of dotNetRDF.sln for people who would >> clone just the core project and renamed the other dotNetRDF.All.sln >> >> Things I didn't finish: >> >> 1. GraphBenchmarker project. The gzipped datasets now cause missing ttl >> files for me. >> 2. dotNetRDF.snk file is missing >> 3. The web projects don't load for me at the moment (see question 2 below) >> 4. I cannot run unit tests (I am currently using VS Express + >> Lightswitch and I'm unable to open the unit tests project) >> 5. File bad-example.ttl is missing in rdfWebDeploy project >> 6. I haven't touched the Build folder because I don't want to break >> something. I think updating paths should be enough. >> >> I have some questions/suggestions: >> >> 1. I think that the *.suo and *.user files shouldn't be committed. >> Is there a reason why they are? >> 2. For portability I suggest using IIS Express for the WebDemos, BSBM >> and sp2b projects. Won't require having IIS installed. >> 3. Some of the references are available on NuGet. Maybe we could add >> them to the projects with NuGet? I think it has been brought up on the >> list recently... >> >> Regards, >> Tom |
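The subrepository behaviour described above is normally driven by a `.hgsub` file checked into the parent (Build) repository. A minimal sketch follows; the local paths and Bitbucket URLs are guesses based on the repository names in the message, not the actual file contents:

```
core          = https://bitbucket.org/dotnetrdf/core
utilities     = https://bitbucket.org/dotnetrdf/utilities
documentation = https://bitbucket.org/dotnetrdf/documentation
```

With a file like this in place, cloning Build also clones the three child repositories, and each commit in Build records a `.hgsubstate` file pinning every child to a specific revision, which is what makes the cross-repository tag syncing described above possible.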
From: Tomasz P. <tom...@gm...> - 2012-08-25 14:51:51
|
Oh, just one more thing. The Build repository is quite big so I'm pushing it in parts. Will push the rest later today because now I'm going out :) Tom On Sat, Aug 25, 2012 at 4:47 PM, Tomasz Pluskiewicz <tom...@gm...> wrote: > Hi > > After much thought I have created a bitbucket team for dotNetRDF and > not one but multiple repositories. All of them imported as of revision > 2374. Further changes will have to be applied manually but that > shouldn't be much of a problem. I'm thinking maybe we could meet online > using join.me/VoIP? It would be the easiest and quickest way for us to > discuss my proposed version control changes and possibly establish > some common workflow. Who would be interested? Also, what are your > timezones? I'm in Poland so it's UTC+1. > > In short, there are four repositories on bitbucket > (https://bitbucket.org/dotnetrdf): > > - core > - utilities > - documentation > - build > > Core is most of trunk except build and utilities, which got their own > repositories. > Documentation is separate because it was in its own folder under > SVN's root and there was no way I know of to merge it with core while > keeping history. > Build is a bit different. It contains the others as subrepositories > and when pulled it pulls the others along and when pushed it pushes the > others too. As such I think that only Rob should be pushing to the > Build repository. It actually should keep little more than build > scripts and (nightly) builds. It should also be used to sync tags > created in the other repositories. I have done that by linking the > 051, 060, 061, 062, 070 and 071 tags. Updating to any one of them in Build > updates the other repositories to the same tag. Any older tags are > lost because before around 2010/08 the repository had a different > layout and the SVN to Hg conversion can't guess that transition. Other > than that I don't think it is vital. The commits are still in SVN for > reference anyway. > > What have I done: > > 1. 
Updated paths in projects/files so that everything builds fine > 2. Added a stripped version of dotNetRDF.sln for people who would > clone just the core project and renamed the other dotNetRDF.All.sln > > Things I didn't finish: > > 1. GraphBenchmarker project. The gzipped datasets now cause missing ttl > files for me. > 2. dotNetRDF.snk file is missing > 3. The web projects don't load for me at the moment (see question 2 below) > 4. I cannot run unit tests (I am currently using VS Express + > Lightswitch and I'm unable to open the unit tests project) > 5. File bad-example.ttl is missing in rdfWebDeploy project > 6. I haven't touched the Build folder because I don't want to break > something. I think updating paths should be enough. > > I have some questions/suggestions: > > 1. I think that the *.suo and *.user files shouldn't be committed. > Is there a reason why they are? > 2. For portability I suggest using IIS Express for the WebDemos, BSBM > and sp2b projects. Won't require having IIS installed. > 3. Some of the references are available on NuGet. Maybe we could add > them to the projects with NuGet? I think it has been brought up on the > list recently... > > Regards, > Tom |
From: Tomasz P. <tom...@gm...> - 2012-08-25 14:48:32
|
Hi After much thought I have created a bitbucket team for dotNetRDF and not one but multiple repositories. All of them imported as of revision 2374. Further changes will have to be applied manually but that shouldn't be much of a problem. I'm thinking maybe we could meet online using join.me/VoIP? It would be the easiest and quickest way for us to discuss my proposed version control changes and possibly establish some common workflow. Who would be interested? Also, what are your timezones? I'm in Poland so it's UTC+1. In short, there are four repositories on bitbucket (https://bitbucket.org/dotnetrdf): - core - utilities - documentation - build Core is most of trunk except build and utilities, which got their own repositories. Documentation is separate because it was in its own folder under SVN's root and there was no way I know of to merge it with core while keeping history. Build is a bit different. It contains the others as subrepositories and when pulled it pulls the others along and when pushed it pushes the others too. As such I think that only Rob should be pushing to the Build repository. It actually should keep little more than build scripts and (nightly) builds. It should also be used to sync tags created in the other repositories. I have done that by linking the 051, 060, 061, 062, 070 and 071 tags. Updating to any one of them in Build updates the other repositories to the same tag. Any older tags are lost because before around 2010/08 the repository had a different layout and the SVN to Hg conversion can't guess that transition. Other than that I don't think it is vital. The commits are still in SVN for reference anyway. What have I done: 1. Updated paths in projects/files so that everything builds fine 2. Added a stripped version of dotNetRDF.sln for people who would clone just the core project and renamed the other dotNetRDF.All.sln Things I didn't finish: 1. GraphBenchmarker project. The gzipped datasets now cause missing ttl files for me. 2. dotNetRDF.snk file is missing 3. 
The web projects don't load for me at the moment (see question 2 below) 4. I cannot run unit tests (I am currently using VS Express + Lightswitch and I'm unable to open the unit tests project) 5. File bad-example.ttl is missing in rdfWebDeploy project 6. I haven't touched the Build folder because I don't want to break something. I think updating paths should be enough. I have some questions/suggestions: 1. I think that the *.suo and *.user files shouldn't be committed. Is there a reason why they are? 2. For portability I suggest using IIS Express for the WebDemos, BSBM and sp2b projects. Won't require having IIS installed. 3. Some of the references are available on NuGet. Maybe we could add them to the projects with NuGet? I think it has been brought up on the list recently... Regards, Tom |
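On question 1 above: Visual Studio's `*.suo` and `*.user` files hold per-developer state (window layout, debugger settings) and are conventionally excluded from version control. A minimal `.hgignore` for the new Mercurial repositories could look like the following; the `bin`/`obj` patterns are an extra suggestion beyond what the message asks about:

```
syntax: glob

*.suo
*.user
bin/
obj/
```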
From: Rob V. <rv...@do...> - 2012-08-24 17:56:56
|
Sorry I gave a bad path for the Fuseki download, the correct path is https://dist.apache.org/repos/dist/release/jena/binaries/ From: Cray Employee <rv...@do...> Reply-To: dotNetRDF Developer Discussion and Feature Request <dot...@li...> Date: Friday, August 24, 2012 9:58 AM To: dotNetRDF Developer Discussion and Feature Request <dot...@li...> Cc: Ron Michael Zettlemoyer <ron...@fy...> Subject: Re: [dotNetRDF-Develop] Regression testing before commit > Hey Kal > > To be honest most of the time changes to the core APIs shouldn't affect the > Storage APIs though ideally you would run with at least some subset of them > just in case. > > Fuseki is super easy to download and run requiring no install, grab the fuseki > distro from jena.apache.org/dist/, unzip to a folder then just run with the > following: > > java -jar fuseki-server.jar --update --mem /ds > > Then just make sure you have Fuseki enabled in the UnitTestConfig.properties > and it should run those tests though sometimes you may need to do a Rebuild to > get it to pick up changes to the file (a VS oddity). There are others that > are relatively simple but all require some install of some kind. > > One thing I have been thinking of doing but haven't got round to yet is > setting up some CI servers where we have copies of all the supported stores > available and running. Then we could actually check in a > UnitTestConfig.properties rather than just having a template for it so any > developer would automatically be able to run the full tests against the set of > remotely hosted stores without requiring local installs. If something does > need debugging the developer can always alter their local copy of > UnitTestConfig.properties to point to a local install of a store to make that > easier for them. > > Infrastructure-wise I'm not sure what the best way of doing this is, I would > guess using one of the cloud services like EC2 or Azure would be easiest but > I'm not sure how cost effective they are. 
I know Ron uses EC2 for some stuff > his company does so he may have some useful input on this. Most likely we > would need at least two servers because some stores are cross platform and > easier to set up on Windows while others are Linux only. At a push we may be > able to get away with a single Linux box but having a Windows box as well > would be useful because we could also use that to host a CI system like > Jenkins and have it run automated builds and tests as people check in code. > > What do you think? > > Rob > > From: Kal Ahmed <ka...@ne...> > Reply-To: dotNetRDF Developer Discussion and Feature Request > <dot...@li...> > Date: Tuesday, August 21, 2012 11:30 PM > To: dotNetRDF Developer Discussion and Feature Request > <dot...@li...> > Subject: [dotNetRDF-Develop] Regression testing before commit > >> Hi Rob, >> >> OK, so the separate question is about the extent of regression testing for >> commits. The current unit testing framework for the project allows for the >> third party stores to be optionally present. My question is: do you have >> a preference for what level of regression testing is done on changes to core? >> Currently I don't have any of the third-party stores installed on any of my >> dev machines / VMs, so although I can run quite a significant proportion of >> the unit tests, there are quite a number that get skipped. Is that a big >> problem? Should I install at least one (or more) of these stores and ensure >> that the unit tests for that store run correctly before committing? Ideally >> I would like to avoid having to install them all...but am I just being lazy? >> >> Cheers >> >> Kal >> >> -- >> Kal Ahmed >> Director, Networked Planet Limited >> e: kal...@ne... 
>> w: www.networkedplanet.com <http://www.networkedplanet.com> >> t: +44 1865 811131 |
From: Rob V. <rv...@do...> - 2012-08-24 17:19:04
|
Yep, it is perfectly clear from your description. I think we understand what you are getting at; we just aren't necessarily used to working in that way, though I expect it isn't that hard to make the change Rob On 8/21/12 2:28 PM, "Tomasz Pluskiewicz" <tom...@gm...> wrote: >Hi Rob and Kal > >What you describe below is where DVCSs shine. We could achieve exactly >the CTR style of workflow easily. A safe and flexible approach is to >have the main repository read-only for anyone but a designated >maintainer, who would merge other users' work. This would be Rob in >our case. With Linux for example each developer creates his own remote >repository, which he then pulls locally to work on the code. Let me >describe it in steps: > >1. Maintainer creates the repository with initial structure available >remotely. Let's call it master.remote >2. Developers A and B clone those repositories and get devA.remote and >devB.remote respectively >3. Developers A and B then pull from their remote repositories to >their computers to get the code and make some changes >4. When they are ready they push the changes back to devA.remote and >devB.remote >5. Now the maintainer pulls from devA.remote and devB.remote to his >local repository master.local, reviews what has changed and if it is >good, merges and pushes to master.remote for others to get > >Now there are some important facts. First, each developer can create >multiple remote and local clones to easily work on unrelated changes >without worrying that something would break. They can do frequent >commits and easily discard what is not needed anymore and should not >see the light of day :). Second, because anyone can create their own >clone of the repository and then ask the maintainer to pull their >changes there is no more need for any ACL and granting access to >commit. Someone new to the project can invent a feature (maybe discuss >it on this list?), clone, implement that feature and share their code >with ease. 
All with much less risk of interrupting others' work. >Actually a third party could even clone to create their own private >fork. Add their secret features and always stay up-to-date because >they can commit in their own clone and pull whatever changes in the >official repository. > >Tell me if it's clear what I've written here... > >Regards, >Tom > >On Tue, Aug 21, 2012 at 7:40 PM, Rob Vesse <rv...@do...> wrote: >> Kal >> >> Comments inline >> >> From: Kal Ahmed <ka...@ne...> >> Reply-To: dotNetRDF Developer Discussion and Feature Request >> <dot...@li...> >> Date: Tuesday, August 21, 2012 12:59 AM >> To: dotNetRDF Developer Discussion and Feature Request >> <dot...@li...> >> Subject: Re: [dotNetRDF-Develop] Open/Proposed Committer Invites >> >> Hi Rob, >> >> Thanks for inviting us to join you on the project! I look forward to >> hopefully making some useful contributions :) >> >> One thing I would be keen for us to discuss and agree is what procedures we >> want to adopt as a group for dealing both with bugs and with feature >> requests / extensions of the project. It's probably not possible to cover >> every possibility but at least having some idea about whether it >> is OK to go in and fix bugs from the list; what you should do if you find a >> bug and have a patch (i.e. report it first or just patch it); and what to do >> about ideas for extensions / features etc. >> >> In general I think it would work best if there is communication on this >> list, even if it is just along the lines of "guys, I found this bug, did X >> to fix it and I'm ready to commit" to give others a chance to explain why it >> is a feature and not a bug :) For bigger things, I think a longer discussion >> would probably be useful to make sure we are on the same page. 
>> >> Personally I tend to prefer the Apache style of CTR (commit then review) >> rather than the RTC (review then commit) style depending on what you are >> working on. We all have access to the repositories and the logs so we all >> can review other commits. On this point I would say please make sure you >> write descriptive commit messages, especially if your commit touches a lot >> of files. I always like being able to glance at the log and not have to dig >> into the actual diffs to know what changed. Related to this if you >> change/fix something try and remember to add it to the Change Log, there's a >> core library specific one under Libraries/core and a toolkit one under >> Utilities. >> >> I would agree that for new features and extensions it is sensible to have >> some degree of discussion on the list so we make sure everyone is in >> agreement on whether to add a feature/extension and how to go about it. >> >> However for bug fixes I would generally prefer that developers use their >> discretion. Make sure to write some unit test(s) that reproduce the bug, >> try to make a fix for the bug and check that your new tests pass AND that >> you haven't caused any existing tests to fail. If existing tests fail try >> to figure out why; in some cases it may be that the existing tests were >> silently assuming the old buggy behavior in which case fix those as well. If >> it isn't clear why the fix causes a regression elsewhere or you just aren't >> sure what the implications of making a change are then it would be sensible >> to start a discussion on the list before you commit. But if you can fix the >> bug and all the tests still pass then please go ahead and commit it. If we >> find out the fix causes other issues at a later point we can always revert >> or rework that commit as appropriate. 
>> >> I will be the first to admit that RDF/SPARQL etc can be somewhat arcane at >> times and I know me and Kal have gone back and forth on some thorny bugs in >> the past where one bug was hiding another bug and so it wasn't clear whether >> the behavior he was seeing was actually a bug or not. If in doubt ask; if >> you can fix it without introducing regressions, fix it. >> >> Rob >> >> >> What do y'all think? >> >> Cheers >> >> Kal >> >> On Mon, Aug 20, 2012 at 10:10 PM, Rob Vesse <rv...@vd...> >> wrote: >>> >>> As you've all probably noticed I've been trying to grow the committers >>> list lately because I don't want to have to do all the work myself nor do I >>> need/want to review every minor patch. If someone has proved their ability >>> by contributing over time I want to give them the opportunity to become more >>> involved with the project if they desire. >>> >>> With this in mind we currently have the following committer invites open >>> to Tomasz Pluskiewicz. >>> >>> I would also like to take the opportunity to invite Graham Moore (Kal's >>> colleague at NetworkedPlanet) to become a committer if he too wishes, I >>> should have done this when I invited Kal but it slipped my mind at the time. >>> >>> I have one/two other intermittent committers who I may approach privately >>> off list first to see if they are still working actively with the project >>> and if they are interested in joining as committers. If anyone else has >>> potential committers in mind who they know have experience with the API and >>> may like to get more involved, suggestions are welcome >>> >>> Thanks, >>> >>> Rob >> >> >> -- >> Kal Ahmed >> Director, Networked Planet Limited >> e: kal...@ne... >> w: www.networkedplanet.com >> t: +44 1865 811131 |
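The five-step workflow Tomasz outlines above maps onto stock Mercurial commands roughly as follows. This is a sketch only: the repository URLs and the `devA`/`master.local` names are placeholders taken from his description, not real project locations:

```shell
# Developer A: clone their personal remote fork (devA.remote) and work locally
hg clone https://bitbucket.org/devA/core devA.local
cd devA.local
# ... edit code, then record the work and publish it back to devA.remote
hg commit -m "Descriptive message about the change"
hg push

# Maintainer (step 5): review devA's changesets before taking them
cd ../master.local
hg incoming https://bitbucket.org/devA/core   # list what would be pulled
hg pull https://bitbucket.org/devA/core       # bring the changesets in
hg merge                                      # merge if the heads diverged
hg commit -m "Merge devA's changes"
hg push                                       # publish to master.remote
```

Because review happens at `hg incoming`/`hg pull` time, nobody other than the maintainer ever needs write access to the main repository, which is the point of the workflow.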
From: Rob V. <rv...@do...> - 2012-08-24 16:59:41
|
Hey Kal To be honest most of the time changes to the core APIs shouldn't affect the Storage APIs though ideally you would run with at least some subset of them just in case. Fuseki is super easy to download and run requiring no install, grab the fuseki distro from jena.apache.org/dist/, unzip to a folder then just run with the following: java -jar fuseki-server.jar --update --mem /ds Then just make sure you have Fuseki enabled in the UnitTestConfig.properties and it should run those tests though sometimes you may need to do a Rebuild to get it to pick up changes to the file (a VS oddity). There are others that are relatively simple but all require some install of some kind. One thing I have been thinking of doing but haven't got round to yet is setting up some CI servers where we have copies of all the supported stores available and running. Then we could actually check in a UnitTestConfig.properties rather than just having a template for it so any developer would automatically be able to run the full tests against the set of remotely hosted stores without requiring local installs. If something does need debugging the developer can always alter their local copy of UnitTestConfig.properties to point to a local install of a store to make that easier for them. Infrastructure-wise I'm not sure what the best way of doing this is, I would guess using one of the cloud services like EC2 or Azure would be easiest but I'm not sure how cost effective they are. I know Ron uses EC2 for some stuff his company does so he may have some useful input on this. Most likely we would need at least two servers because some stores are cross platform and easier to set up on Windows while others are Linux only. At a push we may be able to get away with a single Linux box but having a Windows box as well would be useful because we could also use that to host a CI system like Jenkins and have it run automated builds and tests as people check in code. What do you think? 
Rob From: Kal Ahmed <ka...@ne...> Reply-To: dotNetRDF Developer Discussion and Feature Request <dot...@li...> Date: Tuesday, August 21, 2012 11:30 PM To: dotNetRDF Developer Discussion and Feature Request <dot...@li...> Subject: [dotNetRDF-Develop] Regression testing before commit > Hi Rob, > > OK, so the separate question is about the extent of regression testing for > commits. The current unit testing framework for the project allows for the > third party stores to be optionally present. My question is: do you have a > preference for what level of regression testing is done on changes to core? > Currently I don't have any of the third-party stores installed on any of my > dev machines / VMs, so although I can run quite a significant proportion of > the unit tests, there are quite a number that get skipped. Is that a big > problem? Should I install at least one (or more) of these stores and ensure > that the unit tests for that store run correctly before committing? Ideally I > would like to avoid having to install them all...but am I just being lazy? > > Cheers > > Kal > > -- > Kal Ahmed > Director, Networked Planet Limited > e: kal...@ne... > w: www.networkedplanet.com <http://www.networkedplanet.com> > t: +44 1865 811131 |
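Pulling Rob's Fuseki instructions together as a sketch: the download path is the corrected one from the follow-up message, but the distribution file name and version number are illustrative, and the dataset name `/ds` is an arbitrary choice:

```shell
# Fetch and unpack a Fuseki binary distribution
# (file name/version illustrative -- check the release directory)
wget https://dist.apache.org/repos/dist/release/jena/binaries/jena-fuseki-0.2.4-distribution.zip
unzip jena-fuseki-0.2.4-distribution.zip
cd jena-fuseki-0.2.4

# Start an in-memory, update-enabled SPARQL server (default port 3030)
java -jar fuseki-server.jar --update --mem /ds
```

With the server running and Fuseki enabled in UnitTestConfig.properties, the Fuseki-specific unit tests should then execute instead of being skipped, and because the dataset is in-memory, every restart gives the tests a clean store.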