SF Wiki Helper Blog
When you query a wiki page with the Allura REST API, you receive a JSON representation of the requested page. To parse that JSON from a shell I found two good alternatives: jq and the Python json library.
I did some initial tests and ran into a problem with jq when extracting the markdown text: the default output looked OK, but the raw output, which I actually needed, looked like gibberish. With Python I immediately got the right output, so I decided to stick with Python.
With Python you can also do it in a single command line, although that command is obviously more complex than the jq one. Still, getting a string or a list of strings out of the JSON is quite easy, as the examples below show.
# jq: print the raw (-r) markdown text of the page
curl -s -k -X GET \
https://sourceforge.net/rest/p/demo-project/wiki/Project%20Web%20Services-Draft/ \
| jq -r '.text'
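For reference, the default jq output mentioned above comes from the same command without the -r flag; jq then prints the field as a JSON-encoded string, quoted and with escaped newlines, rather than as raw markdown:

# jq: default (JSON-encoded) output of the markdown text
curl -s -k -X GET \
https://sourceforge.net/rest/p/demo-project/wiki/Project%20Web%20Services-Draft/ \
| jq '.text'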
# Python: parse the JSON on stdin and print its 'text' field
curl -s -k -X GET \
https://sourceforge.net/rest/p/demo-project/wiki/Project%20Web%20Services-Draft/ \
| python -c "import sys, json; print(json.load(sys.stdin)['text'])"
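As a rough sketch of the "list of strings" case, and of the off-line copy this script is about: the first command below assumes the page JSON also carries a labels field holding a list of strings (my assumption, not something shown above), and the second simply redirects the markdown text to a local file whose name is just an example.

# List of strings: print one entry per line (assumes a 'labels' list in the JSON)
curl -s -k -X GET \
https://sourceforge.net/rest/p/demo-project/wiki/Project%20Web%20Services-Draft/ \
| python -c "import sys, json; print('\n'.join(json.load(sys.stdin)['labels']))"

# Off-line copy: save the markdown text to a local file (example file name)
curl -s -k -X GET \
https://sourceforge.net/rest/p/demo-project/wiki/Project%20Web%20Services-Draft/ \
| python -c "import sys, json; print(json.load(sys.stdin)['text'])" \
> 'Project Web Services-Draft.md'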