Apache Allura chat log (chat is hosted on the FreeNode IRC channel #allura)
The new markdown caching stuff doesn't appear to be working properly for wiki.
It displays fine immediately after save, but not after subsequent refreshes.
yeah, certainly looks wrong :(
i can look into that
well so far i have no idea
the same text on a sandbox renders properly
i don't see anything that explains this
page.html_text doesn't get passed through the | safe filter, but it didn't before either
locally, is it getting cached? the same text may render at different speeds
yeah i have my local set to always cache
Have you compared the cached value in prod vs local?
Is it somehow caching something different in one env?
i was just looking at that anyway
does markdown_wiki.convert return a MarkupSafe object and markdown_wiki.cached_convert does not?
cached_convert calls convert, so i wouldn't think that'd be possible
unless it's pulling it from the db
on my sandbox this is not in fact being cached
and i'm not sure why :P
how are you allegedly setting it to always cache?
markdown_cache_threshhold = 0
nice one, thanks
first view renders properly, after that it's raw html
perhaps this is the missing ingredient
ACTION is making a ticket for this
+1, with a test
was reading docs and looks like webhelpers' html.literal uses a markupsafe.Markup class
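A minimal sketch of the failure mode under discussion, assuming the cache stores plain strings in the db: the first render returns a markupsafe.Markup (which Jinja2 leaves alone), while a cache hit returns a bare str that gets escaped back into visible raw HTML on every later view. The helper names here are illustrative, not Allura's actual API:

```python
import markdown
from markupsafe import Markup

md = markdown.Markdown()   # stand-in for Allura's markdown_wiki instance

def cached_convert(artifact, field_name):
    """Sketch only: serve rendered HTML from the cache when present."""
    cached = artifact.get_cached_html(field_name)     # hypothetical helper
    if cached is not None:
        # the db hands back a plain str; without the Markup() wrapper,
        # Jinja2 escapes it on every view after the first, which would
        # explain raw HTML appearing on refresh
        return Markup(cached)
    html = Markup(md.convert(getattr(artifact, field_name)))
    artifact.set_cached_html(field_name, html)        # hypothetical helper
    return html
```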
I assume everyone saw Marvin's thread on the incubator-general list?
Seems like it would be good for one or more of us to chime in as well.
yeah, very interesting thread
i'm not sure yet exactly what I want to say
Yeah, nor I
Also, what do we have to do to get the Allura Wikipedia page moving? grr. :-p
i think just wait..
Hrm. This seems like something that would really benefit from a functional test
Confirm that the actual rendered HTML is correct
i hate you guys
or at least assert that it's a Markup class, since
literal extends from Markup and Markup is what we use more often
functional test should be too hard it seems, create a WikiPage instance and save it, forcing caching, then request the URL
Your Freudian slip is showing. *shouldn't
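A sketch of the functional test being proposed, assuming Allura-style test scaffolding (alluratest's TestController, pylons globals, the forgewiki model); the URL, params, and lookup details are illustrative:

```python
from markupsafe import Markup
from pylons import g                                # pylons-era app globals
from alluratest.controller import TestController    # assumed test base class

class TestWikiMarkdownCache(TestController):
    def test_cached_render_is_not_escaped(self):
        # save a page; with markdown_cache_threshhold = 0 the rendered
        # HTML should land in the cache on the first view
        self.app.post('/wiki/tested/update',
                      params={'title': 'tested', 'text': '# heading'})
        self.app.get('/wiki/tested/')       # first view: populates the cache
        r = self.app.get('/wiki/tested/')   # second view: served from cache
        assert '<h1' in r.body              # still rendered HTML
        assert '&lt;h1' not in r.body       # not escaped back into raw text

    def test_cached_convert_returns_markup(self):
        # the cheaper assertion suggested above: type-check the cache path
        from forgewiki import model as WM   # assumed model import
        page = WM.Page.query.find(dict(title='tested')).first()
        assert isinstance(g.markdown_wiki.cached_convert(page, 'text'), Markup)
```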
i'm just here to do your bidding guys
brondsem et al: I chimed in on the release vote :)
Humbedooh: What would it take to sway you? Fixing that (and a few other) reference(s) would require updating the release artifact and there have been quite a few other changes since then. Do you think the SF reference is worth such a significant update?
cory_fu: if I were that anally retentive, I would have said -1 and not +0.9 ;)
So, what are we missing for that last 10%?
well, mostly just me agreeing with myself that this isn't a big issue and changing it to a +1
which will probably happen sometime during today
Ok. I'm writing up a ticket as we speak to fix those references
we've got one already
I figured we might
so, there's rich and me voting for it, anybody else?
Jim voted on the ppmc list
Not 100% sure if that counts or not
let's call it a documentation bug and proceed ;)
(posted on ML for sake of archives)
I think the incubator rules require the 3x+1 to be in the vote thread in general@, so best to ping jim and get him to recast his +1
just ping him on irc, I suppose :)
pfft, marvin and his records
We recently had an issue with the guess_mime_type logic failing because of a lack of file extension, did we not? What was the resolution to that?
it was a typo in the importer
iirc guess_mime_type handles that situation ok
returns a binary mime type
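For reference, the fallback behavior described above can be sketched with the stdlib mimetypes module; Allura's actual guess_mime_type helper may differ in details:

```python
import mimetypes

def guess_mime_type(filename):
    # mimetypes returns (None, None) when the extension is missing or
    # unknown; fall back to a generic binary type rather than failing
    content_type, _encoding = mimetypes.guess_type(filename)
    if content_type is None:
        content_type = 'application/octet-stream'
    return content_type
```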
tvansteenburgh: I think we need a repo_refresh call in the logic we have now
the cron ends with
curl -L http://forge-allura.apache.org/p/allura/git/refresh;
do we need something else?
is taskd running?
i tried manually submitting a refresh too
but nothing showing up on our repo browser yet
allura@allura-vm:/var/local$ service reactor status
Taskd NORMAL.1 (#1) is stopped
Taskd NORMAL.2 (#2) is stopped
not sure why...
you're using the same watch & restart scripts as we do on SF sandboxes?
well i had to modify them, but yeah
same thing happens there
too many restart signals sent to the process right in a row
so some hit a newly-restarted proc that doesn't have signal handling set up yet, and it dies
what causes that?
inotify sees multiple changes
i don't know what the proper fix is
probably rewrite the watching script in python instead of bash, so it can be smarter
or cron job to start reactor service every 5 min?
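One shape the "smarter watcher in python" idea could take: debounce the burst of inotify events so only one restart fires after the changes settle, giving the new process time to install its signal handlers. Sketched with the third-party watchdog library; the watched path and restart command are assumptions:

```python
import subprocess
import threading
import time

from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

class DebouncedRestarter(FileSystemEventHandler):
    """Coalesce a burst of filesystem events into a single restart."""
    def __init__(self, delay=2.0):
        self.delay = delay
        self._timer = None

    def on_any_event(self, event):
        # reset the timer on every event, so a flurry of changes
        # collapses into one restart once things go quiet
        if self._timer is not None:
            self._timer.cancel()
        self._timer = threading.Timer(self.delay, self._restart)
        self._timer.start()

    def _restart(self):
        subprocess.call(['service', 'reactor', 'restart'])  # assumed command

observer = Observer()
observer.schedule(DebouncedRestarter(), '/var/local/allura', recursive=True)
observer.start()
while True:
    time.sleep(60)
```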
man you need to get root asap
cron would be my hack
i toss that at the end of the current 5min cron entry
yeah, i haven't tried to get root for a while
but we def. need more than just you doing this stuff
+1 for now
especially since i rarely know what i'm doing
ok, cron updated
this is to fix an issue in recently-merged code that changed To headers in outgoing email
was just looking at that, lgtm
i thought i knew how to fix the cc thing too, but i think the sandbox forwarding is messing with the SMTP envelope To
at least i assume that's how it works
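That is how it works: the envelope recipients are passed to the SMTP server separately from the To/Cc headers in the message body, so a forwarder can rewrite one without touching the other. A small stdlib illustration, with made-up addresses:

```python
import smtplib
from email.mime.text import MIMEText

msg = MIMEText('test body')
msg['From'] = 'noreply@example.org'
msg['To'] = 'dev@example.org'          # header To: what recipients see
msg['Cc'] = 'watcher@example.org'      # header Cc

# the envelope recipients actually drive delivery; a forwarder (like the
# sandbox) can rewrite these while leaving the headers above untouched
smtp = smtplib.SMTP('localhost')
smtp.sendmail('noreply@example.org',
              ['dev@example.org', 'watcher@example.org'],
              msg.as_string())
smtp.quit()
```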
make sure you have valid email addresses
some of our sample data is foo@localhost which exim might not like
brondsem: Just to be clear, [#6845] is going to open up at least those two ref styles (rXXX and issue XXX) in general markdown usage, and unless there's an objection, I'm going to open up the other Trac styles as well.
would we want to consider handling those links during import?
like we do for trac import
We don't for trac
not sure how that fits into the way we import that GC content
right, we get html links
do we have the html content when importing GC?
So, it's a different issue, actually
i'd probably lean that way, leverage the remote site's cross-linking by just taking their links. and keep our core parsing/rendering cleaner
tvansteenburgh did a good job with making a clean implementation for it, and it would be easy to extend to the issue XXX style as well
'issue NNN' would be pretty broad though
i'm just hesitant to complicate our core markdown rendering any more than necessary :)
it's already a beast
One more relatively simple regex isn't going to complicate things
fight fight fight!
I'm not invested, I just think you're misapprehending how easy it would be to add support for that
Only concern is that it might match a bit more often than intended
regex would be context-aware so it doesn't run in a code block, literal, [some other link] etc?
Hrm. I guess they're currently done as a preprocessor so they would have that problem, now that I look at it again
i know you can hook into the right stages of markdown, but it takes some work to get the right one, and also make sure there aren't any conflicts with other processors using that stage
Yeah, ok. I was thinking I'd have to do it anyway and that it would just be a case of including the patterns that tvansteenburgh added, but I see that it's not going to be quite that easy and won't solve the issue from the ticket anyway
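A sketch of what moving the pattern to the inline stage could look like, where code blocks and inline literals have already been stashed, using python-markdown 2.x's extension API (the era's version). The pattern name, placement, and URL template are illustrative, not Allura's or tvansteenburgh's actual code:

```python
import markdown
from markdown.extensions import Extension
from markdown.inlinepatterns import Pattern
from markdown.util import etree

ISSUE_RE = r'\bissue (\d+)\b'

class IssueLinkPattern(Pattern):
    def handleMatch(self, m):
        # Pattern wraps the regex in markdown 2.x, so the first
        # user-defined group is m.group(2)
        num = m.group(2)
        el = etree.Element('a')
        el.set('href', '../tickets/%s/' % num)   # illustrative URL scheme
        el.text = 'issue %s' % num
        return el

class IssueLinkExtension(Extension):
    def extendMarkdown(self, md, md_globals):
        # inline patterns run after code blocks/literals are stashed,
        # which avoids the preprocessor problem noted above
        md.inlinePatterns.add('issuelink',
                              IssueLinkPattern(ISSUE_RE, md), '<link')

md = markdown.Markdown(extensions=[IssueLinkExtension()])
# links "issue 123" but leaves the backtick-quoted span alone
print(md.convert('see issue 123, but not `issue 456` in code'))
```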