Downstream patches roundup and ushering

The goal of this project would be to review available patches and bugs from the key downstream maintainers and either contribute these as patches to strace when relevant or work with the downstream maintainers to contribute their patches back to strace. Ideally, mini tools should be created to collect these bugs and patches automatically on a regular basis to inform the strace community of bugs and conversation happening elsewhere than on the strace mailing list.

So my idea is to build a mini tool that collects these bugs and patches, checks whether each patch is right, and tests it automatically; it should also separate the bugs by type, and finally produce a regular report to post somewhere public, such as the community mailing list.
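A minimal sketch of such a collector (the tracker URLs below are only examples of downstream sources and the regexes are placeholders; each real tracker would need its own parser):

import re
import urllib2

# example downstream sources to poll; the real list comes from the maintainers
DOWNSTREAM_SOURCES = {
    "debian": "http://bugs.debian.org/cgi-bin/pkgreport.cgi?pkg=strace",
    "fedora": "http://pkgs.fedoraproject.org/cgit/strace.git/",
}

def collect():
    # pull bug ids and patch links out of each tracker page
    report = {}
    for distro, url in DOWNSTREAM_SOURCES.items():
        html = urllib2.urlopen(url).read()
        bugs = re.findall(r'bug=(\d+)', html)                              # placeholder pattern
        patch_links = re.findall(r'href="([^"]+\.(?:patch|diff))"', html)  # placeholder pattern
        report[distro] = {"bugs": bugs, "patches": patch_links}
    return report

if __name__ == "__main__":
    for distro, found in collect().items():
        print distro, ":", len(found["bugs"]), "bugs,", len(found["patches"]), "patch links"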
Previous work:
I have done this whole kind of work before, as described in this paper: http://dl.acm.org/citation.cfm?id=2310707
In that project, we collected lots of patches and checked whether they were correct.


And I have noticed that this kind of problem seems common; see for example https://lists.debian.org/debian-devel/2013/06/msg00720.html
Crash analysis, patch analysis, and effectiveness testing are also required.
So this kind of downstream work is important and fun. The key problem is how to review the patches and how to do this automatically.
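For the "separate the bugs by type" step, one plausible first pass is keyword triage over the report title and body (the categories and keywords here are my own guesses and would be tuned with the maintainers):

# keyword triage; categories/keywords are a first guess, not a final taxonomy
CATEGORIES = {
    "crash":  ("segfault", "sigsegv", "core dump", "crash"),
    "build":  ("ftbfs", "build failure", "compile error"),
    "decode": ("syscall", "decode", "ioctl", "wrong output"),
}

def classify(title, body=""):
    text = (title + " " + body).lower()
    for category, keywords in CATEGORIES.items():
        for keyword in keywords:
            if keyword in text:
                return category
    return "other"

print classify("strace segfaults on ppc64")   # -> crash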


For already-solved problems, I just grab the patches.
For automatically handling unsolved issues, I will download both versions (from before and after the patch) and run all the tests and test cases;
if they all pass, I will assume the patch is correct and the corresponding issue is solved.
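A sketch of that check, assuming each fix is a single git commit and a stock ./configure && make plus make check build (all of these commands are placeholders for the project's real build and test entry points); I read the criterion as: the tests should fail just before the patch and pass with it applied:

import subprocess

def run(cmd, cwd):
    # run a shell command in the checkout, True when it exits 0
    return subprocess.call(cmd, shell=True, cwd=cwd) == 0

def build(repo, rev):
    # check out one revision and rebuild it
    return (run("git checkout -q %s" % rev, repo)
            and run("./configure && make", repo))

def patch_is_good(repo, commit, test_cmd="make check"):
    # tests fail on the parent revision, pass once the patch is applied
    fails_before = build(repo, commit + "^") and not run(test_cmd, repo)
    passes_after = build(repo, commit) and run(test_cmd, repo)
    return fails_before and passes_after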

Some of my example code and patches will look like this:

Attachment: webkit.py

import pymongo
import os
import urllib2
import re
import time
from functools import wraps
from spatch import SPatch


def proxy_urllib2(url):
    # fetch a URL through the local goagent proxy
    proxy = urllib2.ProxyHandler({'http': '127.0.0.1:8087'})
    opener = urllib2.build_opener(proxy)
    urllib2.install_opener(opener)
    return urllib2.urlopen(url).read()


def retry(ExceptionToCheck, tries=4, delay=3, backoff=2, logger=None):
    # retry decorator with exponential backoff
    def deco_retry(f):
        @wraps(f)
        def f_retry(*args, **kwargs):
            mtries, mdelay = tries, delay
            while mtries > 1:
                try:
                    return f(*args, **kwargs)
                except ExceptionToCheck, e:
                    msg = "%s, Retrying in %d seconds..." % (str(e), mdelay)
                    if logger:
                        logger.warning(msg)
                    else:
                        print msg
                    time.sleep(mdelay)
                    mtries -= 1
                    mdelay *= backoff
            return f(*args, **kwargs)
        return f_retry
    return deco_retry


@retry(urllib2.URLError, tries=4, delay=3, backoff=2)
def get_patch_from_detail_id_link(link):
    # link like http://code.google.com/p/chromium/issues/detail?id=50553
    html = proxy_urllib2(link)
    chromium_re = 'http://src.chromium.org/viewvc/chrome\?view=rev&revision=\d+'
    chromium_re_https = 'https://src.chromium.org/viewvc/chrome\?view=rev&revision=\d+'
    changeset_re = 'http://trac.webkit.org/changeset/\d+'
    # capture the changed file's href next to the ", text changed" marker
    # (best guess at the ViewVC markup)
    patch_re = 'href="([^"]+)">[^<]*, text changed'
    webkit_re = "Show entry in browser\" href=\"(.*).cpp\?rev=\d+"
    # follow Chromium revision pages and fetch the patch view
    for rev_link in re.findall(chromium_re, html) + re.findall(chromium_re_https, html):
        content = proxy_urllib2(rev_link)
        found = re.findall(patch_re, content)
        if found:
            patch_url = "http://src.chromium.org" + found[0] + "&&view=patch"
            return proxy_urllib2(patch_url)
    # follow WebKit changeset pages and fetch the per-file diff
    for webkit_link in re.findall(changeset_re, html):
        content = proxy_urllib2(webkit_link)
        found = re.findall(webkit_re, content)
        if found:
            patch_url = webkit_link + "/" + found[0][9:] + ".cpp?format=diff"
            return proxy_urllib2(patch_url)
    return ""


def get_patch_from_webkit(link):
    print "get patch from webkit:", link
    if link.endswith("cpp"):
        return proxy_urllib2(link + "?format=diff")
    html = proxy_urllib2(link)
    webkit_re = "Show entry in browser\" href=\"(.*).cpp\?rev=\d+"
    link1 = re.findall(webkit_re, html)
    if link1:
        patch_url = link + "/" + link1[0][9:] + ".cpp?format=diff"
        print "patch url:", patch_url
        return proxy_urllib2(patch_url)
    return ""


def get_patch_from_bugs(link):
    # scrape the "Patch" attachment off a bugs.webkit.org page
    html = proxy_urllib2(link)
    patch = ""
    bugs_re = '''attachment.cgi\?id=\d+"\n title="View the content of the attachment">\n Patch'''
    link1 = re.findall(bugs_re, html)
    if link1:
        patch = proxy_urllib2("https://bugs.webkit.org/"
                              + re.findall("attachment.cgi\?id=\d+", link1[0])[0])
    return patch


def save_patch(patch, diff_path, id):
    # write the patch to <diff_path>/webkit/<CVE id>.diff
    vendor_dir = os.path.join(diff_path, "webkit")
    if not os.path.exists(vendor_dir):
        os.makedirs(vendor_dir)
    filename = os.path.join(vendor_dir, id + ".diff")
    print "save as:", filename
    f = open(filename, 'w')
    f.write(patch)
    f.close()
    if os.path.getsize(filename):
        return filename
    return ""


def grab_patch(ref):
    # dispatch on the reference URL to the right scraper
    if ref.startswith("http://trac.webkit.org/"):
        return get_patch_from_webkit(ref)
    if ref.startswith("https://bugs.webkit.org"):
        return get_patch_from_bugs(ref)
    if ref.startswith("https://code.google.com"):
        return get_patch_from_detail_id_link(ref)
    return ""


def update(diff_path, debug=False):
    print "Updating patches for webkit..."
    connect = pymongo.Connection()
    cves = connect.cvedb.cves
    patches = connect['spatchdb']['patches']
    vendor = "webkit"
    for item in cves.find({"vulnerable_configuration": {'$regex': vendor}}):
        id = item["id"]
        for ref in item["references"]:
            patch = grab_patch(ref)
            if not patch:
                continue
            diffs = save_patch(patch, diff_path, id)
            if diffs:
                print ref, "patch saved."
                record = SPatch(id=item['id'], category='webkit',
                                cvss=item['cvss'], summary=item['summary'],
                                publish_time=item['Published'], diffs=diffs)
                patches.insert(record)
            else:
                print ref, "failed to grab patch!"
    print "Done."


def test(diff_path):
    # same walk as update(), but oldest-first and chattier, for debugging
    print "Updating patches for webkit..."
    connect = pymongo.Connection()
    cves = connect.cvedb.cves
    patches = connect['spatchdb']['patches']
    vendor = "webkit"
    for item in cves.find({"vulnerable_configuration": {'$regex': vendor}}).sort("last-modified", 1):
        id = item["id"]
        print id
        for ref in item["references"]:
            patch = grab_patch(ref)
            if not patch:
                continue
            diffs = save_patch(patch, diff_path, id)
            if diffs:
                print ref, "patch saved."
                record = SPatch(id=item['id'], category='webkit',
                                cvss=item['cvss'], summary=item['summary'],
                                publish_time=item['Published'], diffs=diffs)
                patches.insert(record)
            else:
                print ref, "failed to grab patch!"
    print "Done."
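In short, the script walks the CVE records stored in MongoDB (the cvedb.cves collection), follows each CVE's references to trac.webkit.org, bugs.webkit.org, or code.google.com, scrapes the corresponding diff, and stores it on disk plus an SPatch record in MongoDB; the retry decorator with exponential backoff papers over flaky fetches through the local proxy. A typical run is just update("./diffs").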
Attachment: CVE-2013-3302.diff

diff --git a/fs/cifs/transport.c b/fs/cifs/transport.c
index 76d974c..1a52868 100644
--- a/fs/cifs/transport.c
+++ b/fs/cifs/transport.c
@@ -144,9 +144,6 @@ smb_send_kvec(struct TCP_Server_Info *server, struct kvec *iov, size_t n_vec,
 	*sent = 0;
 
-	if (ssocket == NULL)
-		return -ENOTSOCK; /* BB eventually add reconnect code here */
-
 	smb_msg.msg_name = (struct sockaddr *) &server->dstaddr;
 	smb_msg.msg_namelen = sizeof(struct sockaddr);
 	smb_msg.msg_control = NULL;
@@ -291,6 +288,9 @@ smb_send_rqst(struct TCP_Server_Info *server, struct smb_rqst *rqst)
 	struct socket *ssocket = server->ssocket;
 	int val = 1;
 
+	if (ssocket == NULL)
+		return -ENOTSOCK;
+
 	cFYI(1, "Sending smb: smb_len=%u", smb_buf_length);
 	dump_smb(iov[0].iov_base, iov[0].iov_len);
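The fix itself is small: the ssocket == NULL check is hoisted out of smb_send_kvec() into its caller smb_send_rqst(), so a NULL server->ssocket is rejected before anything tries to use it.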
https://github.com/torvalds/linux/graphs/commit-activity
According to this URL, the Linux kernel averages about 600 commits per week.
And they found 599 such patches (described in that paper) over 3 years for linux_kernel.
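At that rate, three years is roughly 600 × 52 × 3 ≈ 93,600 commits, so those 599 patches are well under 1% of the commit stream, which is exactly why the collection and filtering has to be automated.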


Skills:
Python or C is enough, maybe some bash scripting.
buildbot may be useful.

Wrote a module for a tool for developing and executing exploit code against a remote target machine.
Wrote patches for the GitLab project.
Plan:
I have a holiday in August of about 4 weeks.


gmail.com
Address: Tsinghua University 1-213/4-204, 100084