On 12/17/2012 07:36 AM, Michael wrote:
Oops -- I totally misread the S3 requirements: it's 15 GB/month, so
we're fine there, but as Eric pointed out, there's also a 20,000
request limit per month, which we're well over (we've had 67,500
requests since Nov 3's 1.2.0 release).
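As a rough sanity check on the numbers quoted above, here is a short sketch that normalises the 67,500 requests since the Nov 3 release to a monthly rate and compares it to the 20,000-request free-tier limit. The end date (the date of this message) and the paid-tier GET price are assumptions, not figures from the thread; check http://aws.amazon.com/s3/#pricing for the actual rates.

```python
from datetime import date

# Figures from the thread
requests = 67_500                     # GET requests since the 1.2.0 release
start = date(2012, 11, 3)             # 1.2.0 release date
end = date(2012, 12, 17)              # assumed: date of this message
period_days = (end - start).days      # 44 days

per_month = requests / period_days * 30
free_tier_gets = 20_000               # free-tier GET requests per month
over = max(per_month - free_tier_gets, 0)

# Assumed paid-tier GET price (USD per 10,000 requests) -- verify on the
# S3 pricing page before relying on this.
price_per_10k_gets = 0.01
cost = over / 10_000 * price_per_10k_gets

print(f"~{per_month:,.0f} GET requests/month, "
      f"~{over:,.0f} over the free tier, "
      f"~${cost:.2f}/month extra at the assumed rate")
```

If the arithmetic holds, the request rate is roughly double the free-tier allowance, but the overage cost is pennies per month.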
On 12/16/2012 02:50 PM, Damon wrote:
On Sun, Dec 16, 2012 at 1:38 PM, Todd <firstname.lastname@example.org> wrote:
On Sun, Dec 16, 2012 at 8:21 PM, Damon McDougall <email@example.com> wrote:
On Sat, Dec 15, 2012 at 8:25 PM, Jason Grout wrote:
On 12/14/12 10:55 AM, Nathaniel Smith wrote:
sourceforge's horror of an interface.
I'll second that. Every time I go to Sourceforge, I have to figure out
how in the world to download what I want (and I have to figure out which
things *not* to click on too).
Ok, sounds like there is a reasonable amount of resistance towards Sourceforge.
Eric, when you suggest that NumFocus could 'provide hosting directly',
do you mean they would have the physical hardware to host the files,
or are you suggesting they provide the finances to seek hosting elsewhere?
In the GitHub blog post, they suggest using S3. We could try that.
It's fairly inexpensive and the first year is free (within monthly
bandwidth limits). We could try it for a year and see how that pans
out? I'm not entirely sure how the Amazon stuff works but I've heard
good things about it.
Are you sure the monthly bandwidth limits are sufficient?
Also, have you talked to the pypi people about making exceptions for really
popular projects? If critical packages like numpy, scipy, and matplotlib
cannot use pypi, that seems like a major failing of the system.
Here's the pricing: http://aws.amazon.com/s3/#pricing. The free tier
programme limits are on there too. Unfortunately, I do not have the
knowledge to be able to say whether we would hit that or not.
Since Nov 3, when 1.2.0 was released, we've used 1.7 GB of
transfer from the github download site. The S3 "free tier" limit
of 1.5 GB/month is awfully close to that.
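The "awfully close" comparison above can be made concrete by normalising the 1.7 GB of transfer to a monthly rate. The end date of the measurement window is an assumption (taken as the date of the message this quote appears in); the 1.5 GB/month figure is the free-tier limit as quoted above.

```python
from datetime import date

transfer_gb = 1.7                     # transfer from the GitHub download site
start = date(2012, 11, 3)             # 1.2.0 release date
end = date(2012, 12, 14)              # assumed: date of the quoted message
days = (end - start).days             # 41 days

monthly_gb = transfer_gb / days * 30
print(f"~{monthly_gb:.2f} GB/month vs the 1.5 GB/month free-tier figure")
```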