XBMC v13.2 does not work properly with curl v7.38.0: I can no longer scrape or download anything through XBMC. The log shows curl timing out. I rebuilt XBMC against this new version of curl, but still, no downloads are possible. Downgrading to curl v7.37.1 solves the problem, so it seems to be related to this new release.
I have verified that I can download files from the shell with the current version of curl. The XBMC dev group closed my ticket against XBMC with the comment, "report to curl."
Here are the lines from xbmc's log file showing the timeout:
ERROR: CCurlFile::FillBuffer - Failed: Timeout was reached(28)
ERROR: CCurlFile::CReadState::Connect, didn't get any data from stream.
ERROR: Open - failed to open source http://mirrors.xbmc.org/addons/gotham/addons.xml.md5
ERROR: CCurlFile::FillBuffer - Failed: Timeout was reached(28)
ERROR: CCurlFile::CReadState::Connect, didn't get any data from stream.
ERROR: Open - failed to open source http://mirrors.xbmc.org/addons/frodo/addons.xml.md5
ERROR: CCurlFile::FillBuffer - Failed: Timeout was reached(28)
ERROR: CCurlFile::CReadState::Connect, didn't get any data from stream.
ERROR: Open - failed to open source http://mirrors.xbmc.org/addons/gotham/addons.xml|Encoding=gzip
ERROR: CCurlFile::FillBuffer - Failed: Timeout was reached(28)
ERROR: CCurlFile::CReadState::Connect, didn't get any data from stream.
ERROR: Open - failed to open source http://mirrors.xbmc.org/addons/frodo/addons.xml|Encoding=gzip
ERROR: Repository XBMC.org Add-ons returned no add-ons, listing may have failed
OS: Arch Linux x86_64
% curl -V
curl 7.38.0 (x86_64-unknown-linux-gnu) libcurl/7.38.0 OpenSSL/1.0.1i zlib/1.2.8 libssh2/1.4.3
Protocols: dict file ftp ftps gopher http https imap imaps pop3 pop3s rtsp scp sftp smtp smtps telnet tftp
Features: AsynchDNS IPv6 Largefile GSS-API SPNEGO NTLM NTLM_WB SSL libz TLS-SRP
Thanks for your report.
Can you please clarify exactly what you do with curl, what you expected to happen, and what actually happened instead? Please try to put together a reproduction that doesn't require xbmc.
Hello Daniel. The expected action is for xbmc to pull down data via curl. What actually happened is that the call timed out. I wish I could provide you with additional info to help you, but the xbmc dev team has made it clear that the problem lies with curl, not xbmc. I am not a programmer, just an end user.
Did the way curl handles ipv4 and ipv6 change between the two releases I mentioned?
I'm sorry, but it doesn't help me much to learn what the xbmc devs said unless they included more details in their reasoning. I know nothing about xbmc internals, or what it does with curl or how. I can't track down a problem knowing only that someone used curl and it didn't do what that someone thinks it should.
No, I can't think of any differences in IPv4/IPv6 handling among these versions. Why do you suspect that to be a reason?
Hello,
Like graysky I am an Arch Linux & XBMC user with the same issue.
Like him as well, I do not know what XBMC tries to do with libcurl... but I have bisected curl to find the 'bad' commit.
This is what I found:
cacdc27f52ba7b0bf08aa57886bfbd18bc82ebfb is the first bad commit
commit cacdc27f52ba7b0bf08aa57886bfbd18bc82ebfb
Author: Daniel Stenberg daniel@haxx.se
Date: Mon Aug 25 11:34:14 2014 +0200
Now, just looking at the name "Curl_expire_latest()" and the fact that the XBMC logs complain about timeouts, I would assume this is the right place to look.
(Now I don't know what is 'wrong' here, XBMC or curl...)
Please ask if you need more information.
John
Last edit: John E 2014-09-21
Thanks, that's indeed a very good clue and a step forward. Do you know if xbmc sets a speed limit for the transfer?
Hi - I am fritsch of team xbmc. Thanks for looking into this.
We are setting some common options, here: https://github.com/xbmc/xbmc/blob/master/xbmc/filesystem/CurlFile.cpp#L429
This is our FillBuffer method:
https://github.com/xbmc/xbmc/blob/master/xbmc/filesystem/CurlFile.cpp#L1418
Our default timeout is 10 seconds.
We set a minimum speed limit in order to recognize timeouts earlier: https://github.com/xbmc/xbmc/blob/master/xbmc/filesystem/CurlFile.cpp#L594
Best Regards
Peter
You mean default timeout as in what you set CURLOPT_LOW_SPEED_TIME to?
Can you elaborate more on when this problem happens? Always? On slow transfers? On big transfers?
We see this issue when scraping data from movie providers (like The Movie Database).
I don't think that involves any big data; for example, this could be a 'file' we would try to get:
http://api.tmdb.org/3/search/movie?api_key=57983e31fb435df4df77afb854740ea9&query=Batman%20The%20Dark%20Knight%20Returns%2c%20Part%201&year=2012&language=en
Given the small amount of data there, I am not sure whether slow/fast transfer would matter.
Log:
16:24:16 T:140013845079808 DEBUG: CurlFile::Open(0x7f574c8eabc0) http://api.tmdb.org/3/search/movie?api_key=57983e31fb435df4df77afb854740ea9&query=Batman%20The%20Dark%20Knight%20Returns%2c%20Part%201&year=2012&language=en
<stuff...>
16:24:26 T:140013845079808 ERROR: CCurlFile::FillBuffer - Failed: Timeout was reached(28)
16:24:26 T:140013845079808 ERROR: CCurlFile::CReadState::Connect, didn't get any data from stream.
But that data shouldn't take 10 seconds to load; as you can see, loading it in a browser is quick.
I'll leave the rest to fritsch, as I'll probably start to spew nonsense after that.
Thanks,
John
Hi, concerning CURLOPT_LOW_SPEED_TIME, we set:
g_curlInterface.easy_setopt(h, CURLOPT_LOW_SPEED_LIMIT, 1);
to get a timeout when we receive < 1 byte/sec for 20 seconds.
Last edit: Peter Frühberger 2014-09-22
I started trying to write a stand-alone program that reproduces this problem, but so far I've failed. This is what I have right now:
https://gist.github.com/bagder/dd7a56d2bb76d99a12a6
It feels likely that this problem is due to a specific option or a specific code flow...
I'd appreciate all the help I can get in producing a reproducible test case.
I'm not a very technical person, but I will try to help.
Here is the debug output for the RSS feeds:
15:45:08 T:139904277763840 DEBUG: CurlFile::Open(0x7f3e00c74c40) http://feeds.xbmc.org/xbmc
15:45:08 T:139904277763840 DEBUG: Curl::Debug - TEXT: Hostname was NOT found in DNS cache
15:45:10 T:139904277763840 DEBUG: Curl::Debug - TEXT: Resolving timed out after 2003 milliseconds
15:45:10 T:139904277763840 ERROR: CCurlFile::FillBuffer - Failed: Timeout was reached(28)
15:45:10 T:139904277763840 ERROR: CCurlFile::CReadState::Connect, didn't get any data from stream.
15:45:10 T:139904277763840 DEBUG: Curl::Debug - TEXT: Closing connection 1
So I used this RSS feed in xbmc as a test:
http://services.sapo.pt/RSS/Feed/sapo/desporto/teasers
http://213.13.145.106/RSS/Feed/sapo/desporto/teasers
Using the IP makes the RSS feed work in xbmc, so it's probably a DNS problem:
17:07:48 T:140638563247872 DEBUG: Curl::Debug - TEXT: Connection #0 to host 213.13.145.106 left intact
17:07:48 T:140638563247872 DEBUG: Got rss feed: http://213.13.145.106/RSS/Feed/sapo/desporto/teasers
17:07:48 T:140638563247872 DEBUG: RSS feed encoding: UTF-8
17:07:48 T:140638563247872 DEBUG: Parsed rss feed: http://213.13.145.106/RSS/Feed/sapo/desporto/teasers
By domain:
17:12:48 T:139786334021376 DEBUG: CurlFile::Open(0x7f228ac89c40) http://services.sapo.pt/RSS/Feed/sapo/desporto/teasers
17:12:48 T:139786334021376 DEBUG: Curl::Debug - TEXT: Hostname was NOT found in DNS cache
17:12:50 T:139786334021376 DEBUG: Curl::Debug - TEXT: Resolving timed out after 2002 milliseconds
17:12:50 T:139786334021376 ERROR: CCurlFile::FillBuffer - Failed: Timeout was reached(28)
17:12:50 T:139786334021376 ERROR: CCurlFile::CReadState::Connect, didn't get any data from stream.
After downgrading to curl-7.37.1:
17:19:11 T:140130845107968 DEBUG: Curl::Debug - TEXT: Connection #0 to host services.sapo.pt left intact
17:19:11 T:140130845107968 DEBUG: Got rss feed: http://services.sapo.pt/RSS/Feed/sapo/desporto/teasers
Hope that helps
Last edit: zygo 2014-09-22
I'm attaching my suggested fix here, which is a partial revert of commit cacdc27f52b and should improve the threaded resolver backend logic.
It'd be great if someone who experienced this problem could give this a spin and see if it truly changes anything!
$ curl -V
curl 7.38.0 (x86_64-unknown-linux-gnu) libcurl/7.38.0 OpenSSL/1.0.1i zlib/1.2.8 libssh2/1.4.3 librtmp/2.3
Protocols: dict file ftp ftps gopher http https imap imaps pop3 pop3s rtmp rtsp scp sftp smtp smtps telnet tftp
Features: AsynchDNS IPv6 Largefile GSS-API SPNEGO NTLM NTLM_WB SSL libz TLS-SRP
I applied your patch and xbmc is working OK on Arch.
Many thanks for your help
Many thanks for that. I've pushed the fix to git now, and unless someone else has something to add about this problem, I consider the case closed.
Yes, I too can confirm the patch fixes this problem. Thank you to everyone who contributed for the rapid and accurate fix.
Lovely, thanks all. Case closed!
The patch is now in Arch, thanks everyone! (especially Daniel for the quick turnaround)