Here's a simple way to get a list of HTTPS radio stations if the one in my example expires:
wget "http://www.radio-browser.info/webservice/pls/stations/bytagexact/university radio" -O university_radio.pls
grep https university_radio.pls
Streamripper cannot correctly deal with HTTPS URLs; it handles them the same way as HTTP URLs. The problem with the example URL is that https://hirschmilch.de/psytrance/listen.pls is transformed to http://hirschmilch.de/psytrance/listen.pls within Streamripper. The server then issues a redirect back to the secure URL (Location: https://hirschmilch.de/psytrance/listen.pls). Streamripper follows the redirect, but again to the unencrypted URL http://hirschmilch.de/psytrance/listen.pls. As a result, Streamripper goes into an endless loop.
The correct solution would be to use SSL/TLS. Given the other HTTP/HTTPS-related bugs, I think Streamripper should use an HTTP/HTTPS library like libcurl for all these tasks.
Thank you once again, good suggestion.