[Pypt-offline-general] SF.net SVN: pypt-offline: [106] trunk
From: <rit...@us...> - 2006-12-29 18:14:52
Revision: 106
http://svn.sourceforge.net/pypt-offline/?rev=106&view=rev
Author: riteshsarraf
Date: 2006-12-29 10:14:53 -0800 (Fri, 29 Dec 2006)
Log Message:
-----------
Realigning the tree to trunk
Added Paths:
-----------
trunk/CHANGELOG
trunk/INSTALL
trunk/KNOWN-BUGS
trunk/LICENSE
trunk/Makefile
trunk/README
trunk/THANKS
trunk/TODO
trunk/build.xml
trunk/launch-pypt-offline-gui.py
trunk/progressbar.py
trunk/pypt-offline.sh
trunk/pypt_core.py
trunk/pypt_logger.py
trunk/pypt_magic.py
trunk/pypt_md5_check.py
trunk/pypt_offline_gui.ui.h
trunk/pypt_progressbar.py
trunk/pypt_variables.py
trunk/pyptofflinegui.py
trunk/pyptofflinegui.ui
Removed Paths:
-------------
trunk/CVSROOT/
trunk/pypt-offline/
Copied: trunk/CHANGELOG (from rev 105, trunk/pypt-offline/CHANGELOG)
===================================================================
--- trunk/CHANGELOG (rev 0)
+++ trunk/CHANGELOG 2006-12-29 18:14:53 UTC (rev 106)
@@ -0,0 +1,15 @@
+Version ?? -- XX/XX/XXXX
+ * Broke pypt-offline.py into pypt_core.py and pypt-offline.py for modularity.
+
+Version 0.5beta -- 03/10/2005
+ * Did more code cleanup
+ * Changed the algorithm for checking the local cache
+ * Now supports searching recursively through folders for packages (like apt-proxy's folder structure)
+
+Version 0.4beta -- 02/20/2005
+ * Did some code cleanup
+ * Added support for Windows -- now usable on Microsoft Windows systems as well
+ * Removed dependency on "wget". Now using a native Python library to fetch URLs.
+
+Version 0.3alpha -- 01/17/2005
+ * Initial Release
Copied: trunk/INSTALL (from rev 105, trunk/pypt-offline/INSTALL)
===================================================================
--- trunk/INSTALL (rev 0)
+++ trunk/INSTALL 2006-12-29 18:14:53 UTC (rev 106)
@@ -0,0 +1,13 @@
+## Under Linux
+
+chmod 755 pypt-offline[.py]
+Execute the command with the proper options
+./pypt-offline [OPTIONS]
+
+
+## Under Microsoft Windows
+
+# Execute the file using the python interpreter
+# Assuming python.exe is in your path
+
+C:\> python pypt-offline [OPTIONS]
Copied: trunk/KNOWN-BUGS (from rev 105, trunk/pypt-offline/KNOWN-BUGS)
===================================================================
--- trunk/KNOWN-BUGS (rev 0)
+++ trunk/KNOWN-BUGS 2006-12-29 18:14:53 UTC (rev 106)
@@ -0,0 +1,4 @@
+Version 0.5b -- 03/27/2005
+ * Executing pypt-offline without any argument causes an exception [FIXED]
+ * Loose Microsoft Windows support (I haven't done thorough testing on Windows machines; please let me know of any bugs) [FIXED]
+ * urllib.urlretrieve doesn't raise an exception even if the target url is broken [FIXED]
Copied: trunk/LICENSE (from rev 105, trunk/pypt-offline/LICENSE)
===================================================================
--- trunk/LICENSE (rev 0)
+++ trunk/LICENSE 2006-12-29 18:14:53 UTC (rev 106)
@@ -0,0 +1,16 @@
+Copyright (C) 2005, 2006 by Ritesh Raj Sarraf <rr...@re...>
+
+This program is free software; you can redistribute it and/or modify
+it under the terms of the GNU General Public License as published by
+the Free Software Foundation; either version 2 of the License, or
+(at your option) any later version.
+
+This program is distributed in the hope that it will be useful,
+but WITHOUT ANY WARRANTY; without even the implied warranty of
+MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+GNU General Public License for more details.
+
+You should have received a copy of the GNU General Public License
+along with this program; if not, write to the
+Free Software Foundation, Inc.,
+59 Temple Place - Suite 330, Boston, MA 02111-1307, USA.
\ No newline at end of file
Copied: trunk/Makefile (from rev 105, trunk/pypt-offline/Makefile)
===================================================================
--- trunk/Makefile (rev 0)
+++ trunk/Makefile 2006-12-29 18:14:53 UTC (rev 106)
@@ -0,0 +1,5 @@
+all:
+ pyuic pyptofflinegui.ui > pyptofflinegui.py
+
+clean:
+ rm -f pyptofflinegui.py
\ No newline at end of file
Copied: trunk/README (from rev 105, trunk/pypt-offline/README)
===================================================================
--- trunk/README (rev 0)
+++ trunk/README 2006-12-29 18:14:53 UTC (rev 106)
@@ -0,0 +1,188 @@
+pypt-offline -- An Offline Package Manager
+(C) 2005, 2006 Ritesh Raj Sarraf <rr...@re...>
+
+
+
+# INTRODUCTION
+
+So you've decided to give this small piece of work a try.
+Good! Let's get started quickly.
+
+
+
+pypt-offline is an offline package management tool written in the Python programming language.
+This program, as of now, is intended for people using Debian (and Debian-based) systems.
+
+This program lets you leverage the power of Debian (more precisely, apt-get) on a completely
+disconnected machine. Many people with slow or no internet connection (including many in
+India/Nepal/Pakistan and nearby countries) have not considered using Debian (or Debian-derived
+distributions) because Debian is at its best when the machine is connected to the internet.
+
+This utility is an attempt to eradicate that problem. I hope it comes in useful for you.
+I'd be eager to hear your comments/suggestions. Feel free to drop an email at rrs _AT_ researchut |DOT| com
+
+
+#########################################################################
+
+Let us assume you have a machine at home (hereafter called Machine-A) with no, or a very expensive,
+internet connection, on which you've installed Debian (hereafter all Debian and Debian-based systems
+will simply be called Debian).
+
+You or a friend works at a local city office with a high-speed internet connection.
+The machine used there is a Linux/Windows/Mac box. We'll call it Machine-B henceforth.
+
+pypt-offline allows you to synchronize Machine-A's apt database with the Debian archives.
+It does so by
+* Extracting from Machine-A's apt database the details of what needs to be updated
+* Fetching the required data on Machine-B
+* Synchronizing the fetched data back to Machine-A
+
+
+With these 3 steps you can keep a disconnected Debian machine up-to-date on a daily basis.
+
+
+The rest of the document will describe the details along with the commands.
+
+
+
+STEP - 1
+
+
+
+On Machine-A:
+
+
+pypt-offline --set-update /tmp/update-uris
+
+
+With this command, pypt-offline extracts the details from apt's database for the files that
+are required to update apt's package database. The extracted information is stored in /tmp/update-uris
+
+
+pypt-offline --set-upgrade /tmp/upgrade-uris --upgrade-type dselect-upgrade
+
+
+With this command, pypt-offline extracts the details from apt's database for the packages that
+are required to be upgraded. The extracted information is stored in /tmp/upgrade-uris
+There are 3 "upgrade" types: "upgrade", "dist-upgrade" and "dselect-upgrade".
+You can pass any one of them.
+Note: --set-upgrade and --upgrade-type are mutually inclusive. You cannot use one option without the other
+
+
+pypt-offline --set-install /tmp/install-uris --set-install-packages package1 package2 packageN
+
+
+With this command, pypt-offline extracts the details from apt's database for the packages
+(and its dependent packages) that are required to be installed. The extracted information is
+stored in /tmp/install-uris
+Note: --set-install and --set-install-packages are mutually inclusive. You cannot use one option
+without the other
+
+
+
+The above mentioned options are executed on Machine-A. They extract the details from apt's database.
+
+
+Now the user needs to copy the extracted data file onto removable media and take it to Machine-B.
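+
+For illustration, each line of the extracted data file carries four space-separated fields:
+the download URL, the package file name, the download size and the md5 checksum. This is the
+format that the stripper() routine in pypt_core.py (committed below) pulls apart. A purely
+hypothetical line would look roughly like:
+
+'http://ftp.debian.org/debian/pool/main/h/hello/hello_2.1.1-4_i386.deb' hello_2.1.1-4_i386.deb 48226 <md5sum>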
+
+
+
+STEP - 2
+
+
+
+On Machine-B:
+
+With the extracted data file in hand, on Machine-B, execute the following commands:
+
+
+pypt-offline --fetch-update /tmp/update-uris [ -d /tmp/updates/ -s /tmp/repository/ --disable-md5check
+--zip --zip-update-file /tmp/updates.zip --zip-upgrade-file /tmp/upgrades.zip]
+
+
+With this command, pypt-offline fetches the required data from the internet as directed by Machine-A.
+The options in square brackets are optional.
+
+
+-d /tmp/updates/ - This option sets the directory where the downloaded data will be saved.
+If the option is not given, a folder named pypt-offline-downloads is created under the current working directory.
+
+
+-s /tmp/repository/ - With this option, the directory where the previously downloaded data was saved
+is searched. This is used so that you don't download the same file again and again. If the option is
+not given, the current working directory is used as the repository and all files and folders under it
+are recursively searched.
+Also, the freshly downloaded files are copied to this folder so that you don't have to download them again
+the next time.
+Note: This option is effective only for packages which are downloaded and *NOT* for update files.
+Update files change almost daily in the Debian repositories, hence keeping a local cache is useless.
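+
+In essence, the lookup walks the whole repository tree and copies the first file whose name
+(and md5 checksum) matches. A minimal sketch of that idea, using a hypothetical find_in_cache()
+helper rather than the actual copy_first_match()/files() code committed in pypt_core.py:
+
+    import os, shutil
+
+    def find_in_cache(cache_dir, filename, dest_dir):
+        # Walk every folder under cache_dir; copy the first name match into dest_dir.
+        # (The real code also verifies the md5 checksum before copying.)
+        for path, dirs, names in os.walk(cache_dir):
+            if filename in names:
+                shutil.copy(os.path.join(path, filename), dest_dir)
+                return True
+        return False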
+
+
+--disable-md5check - Using this option is *highly discouraged*. By default, pypt-offline compares
+the md5 checksum for every package it downloads. If the md5 checksum doesn't match, the file is deleted.
+This is necessary to make sure that you don't end up with tampered packages.
+Note: This option is effective only for packages which are downloaded and *NOT* for update files.
+Update files currently don't provide md5checksum details.
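+
+The check itself amounts to hashing the downloaded file and comparing the result against the
+checksum recorded in the uri file. A minimal sketch of the idea (illustrative only; the real
+logic lives in pypt_md5_check.py and may differ in detail):
+
+    import md5
+
+    def checksum_matches(path, expected_md5):
+        # Hash the downloaded package and compare against the expected md5 checksum.
+        data = open(path, 'rb').read()
+        return md5.new(data).hexdigest() == expected_md5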
+
+
+--zip - It is *highly encouraged* to use this option. Though disabled by default, if enabled, pypt-offline
+creates a single zip file of the downloaded files. This way it is easier to track the downloaded files.
+
+
+--zip-update-file - This option sets the zip file name for the update files. If this option is not set,
+pypt-offline uses the default file name, pypt-offline-update.zip
+
+
+--zip-upgrade-file - This option sets the zip file name for the upgrade files. If this option is not set,
+pypt-offline uses the default file name, pypt-offline-upgrade.zip
+
+
+If you don't use the --zip option, the downloaded files are stored in the folder you mentioned with the -d option.
+You'll need to copy the files from the folder.
+
+
+With this, once all the data has been downloaded, copy it to your removable storage device and take it back to Machine-A.
+
+
+
+STEP - 3
+
+
+On Machine-A:
+
+
+Once you come back to Machine-A:
+
+
+
+pypt-offline --install-update /tmp/updates.zip [ /tmp/]
+
+
+With this command, pypt-offline syncs the update files from updates.zip. If a folder name is given as an
+argument to --install-update, it searches for the required files and syncs them to apt's package database.
+
+
+pypt-offline --install-upgrade /tmp/upgrade.zip [/tmp/]
+
+
+With this command, pypt-offline syncs the packages from upgrade.zip. If a folder name is given as an argument
+to --install-upgrade, it searches for the required files and syncs them to apt's package database.
+Note: Please keep in mind that this doesn't actually install/upgrade the packages.
+It just makes the "To Be Downloaded" packages available locally. With them in place, apt will not need to
+download any additional packages and hence will do the installation as if they had been downloaded from the internet.
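+
+In other words, after the sync you would run your usual apt commands on Machine-A, for example
+"apt-get dselect-upgrade" (matching the upgrade type chosen in Step 1) or "apt-get install package1";
+apt should then find everything it needs locally instead of trying to fetch it from the network.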
+
+
+NOTE: If you use apt-listbugs to track package bugs while you install, it'll fail because apt-listbugs will try
+to connect to the internet.
+
+
+
+
+
+That's all. Please send your comments and suggestions to me at rrs _AT_ researchut |DOT| com
+If you come across any bugs [misbehavior, feature requests], file them in the bug tracker.
+There's also a mailing list available at https://lists.sourceforge.net/lists/listinfo/pypt-offline-general
+
+
+Thanks,
+Ritesh Raj Sarraf
\ No newline at end of file
Copied: trunk/THANKS (from rev 105, trunk/pypt-offline/THANKS)
===================================================================
--- trunk/THANKS (rev 0)
+++ trunk/THANKS 2006-12-29 18:14:53 UTC (rev 106)
@@ -0,0 +1,6 @@
+These are some of the people who've helped me a lot in my process of learning programming.
+*) Peter Otten
+*) Duncan Booth
+*) Simon Forman
+*) Dennis Lee Bieber
+*) Any others whom I've missed
\ No newline at end of file
Copied: trunk/TODO (from rev 105, trunk/pypt-offline/TODO)
===================================================================
--- trunk/TODO (rev 0)
+++ trunk/TODO 2006-12-29 18:14:53 UTC (rev 106)
@@ -0,0 +1,14 @@
+* Support for RPM packages (Using YUM)
+* Argument Parsing <done>
+* Usage of apt-proxy like services, if available <done>
+* Usage of a .pypt-offlinerc config file
+* Download status indicator <done>
+* Proxy Authentication
+* Implement a function which will keep track of the failed uris and print them at the end. It'll keep the uri and its failure reason in a dictionary <done>
+* Implement Threads - When we implement threads multiple files will be downloaded at the same time. <done>
+ At that point, say we execute with 5 threads, the progressbar for all 5 threads should be displayed together.
+* Implement Curses and GUI interfaces. These UIs will be interactive. Of all the options, only one will be selectable, and once that option
+  is selected the others will be deactivated. That's by design: only one option allowed at a time. :-)
+* Use Python's logging module to better handle messages, warnings and errors <done>
+* Implement apt's newly added feature (> 0.6.4) of Package Diff
+* Add functionality for Offline Bug Reports
\ No newline at end of file
Copied: trunk/build.xml (from rev 105, trunk/pypt-offline/build.xml)
===================================================================
--- trunk/build.xml (rev 0)
+++ trunk/build.xml 2006-12-29 18:14:53 UTC (rev 106)
@@ -0,0 +1,22 @@
+<?xml version="1.0"?>
+<project name="pypt-offline" default="make">
+ <description>
+ Ant adaptor for the pypt-offline Makefile
+ </description>
+
+ <target name="make" description="build pyptofflinegui">
+ <exec executable="make">
+ <arg value="-f"/>
+ <arg value="Makefile"/>
+ <arg value="all"/>
+ </exec>
+ </target>
+
+ <target name="clean" description="clean pypt-offline" depends="make">
+ <exec executable="make">
+ <arg value="-f"/>
+ <arg value="Makefile"/>
+ <arg value="clean"/>
+ </exec>
+ </target>
+</project>
Copied: trunk/launch-pypt-offline-gui.py (from rev 105, trunk/pypt-offline/launch-pypt-offline-gui.py)
===================================================================
--- trunk/launch-pypt-offline-gui.py (rev 0)
+++ trunk/launch-pypt-offline-gui.py 2006-12-29 18:14:53 UTC (rev 106)
@@ -0,0 +1,13 @@
+#!/usr/bin/env python
+from qt import *
+from pyptofflinegui import pyptofflineguiForm
+
+
+if __name__ == "__main__":
+ import sys
+ a = QApplication(sys.argv)
+ QObject.connect(a,SIGNAL("lastWindowClosed()"),a,SLOT("quit()"))
+ w = pyptofflineguiForm()
+ a.setMainWidget(w)
+ w.show()
+ a.exec_loop()
Copied: trunk/progressbar.py (from rev 105, trunk/pypt-offline/progressbar.py)
===================================================================
--- trunk/progressbar.py (rev 0)
+++ trunk/progressbar.py 2006-12-29 18:14:53 UTC (rev 106)
@@ -0,0 +1,368 @@
+#!/usr/bin/python
+# -*- coding: iso-8859-1 -*-
+#
+# progressbar - Text progressbar library for python.
+# Copyright (c) 2005 Nilton Volpato
+#
+# This library is free software; you can redistribute it and/or
+# modify it under the terms of the GNU Lesser General Public
+# License as published by the Free Software Foundation; either
+# version 2.1 of the License, or (at your option) any later version.
+#
+# This library is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+# Lesser General Public License for more details.
+#
+# You should have received a copy of the GNU Lesser General Public
+# License along with this library; if not, write to the Free Software
+# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA
+
+
+"""Text progressbar library for python.
+
+This library provides a text mode progressbar. This is typically used
+to display the progress of a long running operation, providing a
+visual clue that processing is underway.
+
+The ProgressBar class manages the progress, and the format of the line
+is given by a number of widgets. A widget is an object that may
+display differently depending on the state of the progress. There are
+three types of widget:
+- a string, which always shows itself;
+- a ProgressBarWidget, which may return a different value every time
+its update method is called; and
+- a ProgressBarWidgetHFill, which is like ProgressBarWidget, except it
+expands to fill the remaining width of the line.
+
+The progressbar module is very easy to use, yet very powerful. It also
+automatically supports features like auto-resizing when available.
+"""
+
+__author__ = "Nilton Volpato"
+__author_email__ = "first-name dot last-name @ gmail.com"
+__date__ = "2006-05-07"
+__version__ = "2.2"
+
+# Changelog
+#
+# 2006-05-07: v2.2 fixed bug in windows
+# 2005-12-04: v2.1 autodetect terminal width, added start method
+# 2005-12-04: v2.0 everything is now a widget (wow!)
+# 2005-12-03: v1.0 rewrite using widgets
+# 2005-06-02: v0.5 rewrite
+# 2004-??-??: v0.1 first version
+
+
+import sys, time
+from array import array
+try:
+ from fcntl import ioctl
+ import termios
+except ImportError:
+ pass
+import signal
+
+class ProgressBarWidget(object):
+ """This is an element of ProgressBar formatting.
+
+ The ProgressBar object will call its update method when an update
+ is needed. Its size may change between calls, but the results will
+ not be good if the size changes drastically and repeatedly.
+ """
+ def update(self, pbar):
+ """Returns the string representing the widget.
+
+ The parameter pbar is a reference to the calling ProgressBar,
+ where one can access attributes of the class for knowing how
+ the update must be made.
+
+ At least this function must be overridden."""
+ pass
+
+class ProgressBarWidgetHFill(object):
+ """This is a variable width element of ProgressBar formatting.
+
+ The ProgressBar object will call its update method, informing the
+ width this object must be made. This is like TeX \\hfill; it will
+ expand to fill the line. You can use more than one in the same
+ line, and they will all have the same width, and together will
+ fill the line.
+ """
+ def update(self, pbar, width):
+ """Returns the string representing the widget.
+
+ The parameter pbar is a reference to the calling ProgressBar,
+ where one can access attributes of the class for knowing how
+ the update must be made. The parameter width is the total
+ horizontal width the widget must have.
+
+ At least this function must be overridden."""
+ pass
+
+
+class ETA(ProgressBarWidget):
+ "Widget for the Estimated Time of Arrival"
+ def format_time(self, seconds):
+ return time.strftime('%H:%M:%S', time.gmtime(seconds))
+ def update(self, pbar):
+ if pbar.currval == 0:
+ return 'ETA: --:--:--'
+ elif pbar.finished:
+ return 'Time: %s' % self.format_time(pbar.seconds_elapsed)
+ else:
+ elapsed = pbar.seconds_elapsed
+ eta = elapsed * pbar.maxval / pbar.currval - elapsed
+ return 'ETA: %s' % self.format_time(eta)
+
+class FileTransferSpeed(ProgressBarWidget):
+ "Widget for showing the transfer speed (useful for file transfers)."
+ def __init__(self):
+ self.fmt = '%6.2f %s'
+ self.units = ['B','K','M','G','T','P']
+ def update(self, pbar):
+ if pbar.seconds_elapsed < 2e-6:#== 0:
+ bps = 0.0
+ else:
+ bps = float(pbar.currval) / pbar.seconds_elapsed
+ spd = bps
+ for u in self.units:
+ if spd < 1000:
+ break
+ spd /= 1000
+ return self.fmt % (spd, u+'/s')
+
+class RotatingMarker(ProgressBarWidget):
+ "A rotating marker for filling the bar of progress."
+ def __init__(self, markers='|/-\\'):
+ self.markers = markers
+ self.curmark = -1
+ def update(self, pbar):
+ if pbar.finished:
+ return self.markers[0]
+ self.curmark = (self.curmark + 1)%len(self.markers)
+ return self.markers[self.curmark]
+
+class Percentage(ProgressBarWidget):
+ "Just the percentage done."
+ def update(self, pbar):
+ return '%3d%%' % pbar.percentage()
+
+class Bar(ProgressBarWidgetHFill):
+ "The bar of progress. It will strech to fill the line."
+ def __init__(self, marker='#', left='|', right='|'):
+ self.marker = marker
+ self.left = left
+ self.right = right
+ def _format_marker(self, pbar):
+ if isinstance(self.marker, (str, unicode)):
+ return self.marker
+ else:
+ return self.marker.update(pbar)
+ def update(self, pbar, width):
+ percent = pbar.percentage()
+ cwidth = width - len(self.left) - len(self.right)
+ marked_width = int(percent * cwidth / 100)
+ m = self._format_marker(pbar)
+ bar = (self.left + (m*marked_width).ljust(cwidth) + self.right)
+ return bar
+
+class ReverseBar(Bar):
+ "The reverse bar of progress, or bar of regress. :)"
+ def update(self, pbar, width):
+ percent = pbar.percentage()
+ cwidth = width - len(self.left) - len(self.right)
+ marked_width = int(percent * cwidth / 100)
+ m = self._format_marker(pbar)
+ bar = (self.left + (m*marked_width).rjust(cwidth) + self.right)
+ return bar
+
+default_widgets = [Percentage(), ' ', Bar()]
+class ProgressBar(object):
+ """This is the ProgressBar class, it updates and prints the bar.
+
+ The term_width parameter may be an integer. Or None, in which case
+ it will try to guess it, if it fails it will default to 80 columns.
+
+ The simple use is like this:
+ >>> pbar = ProgressBar().start()
+ >>> for i in xrange(100):
+ ... # do something
+ ... pbar.update(i+1)
+ ...
+ >>> pbar.finish()
+
+ But anything you want to do is possible (well, almost anything).
+ You can supply different widgets of any type in any order. And you
+ can even write your own widgets! There are many widgets already
+ shipped and you should experiment with them.
+
+ When implementing a widget update method you may access any
+ attribute or function of the ProgressBar object calling the
+ widget's update method. The most important attributes you would
+ like to access are:
+ - currval: current value of the progress, 0 <= currval <= maxval
+ - maxval: maximum (and final) value of the progress
+ - finished: True if the bar has finished (reached 100%), False otherwise
+ - start_time: first time update() method of ProgressBar was called
+ - seconds_elapsed: seconds elapsed since start_time
+ - percentage(): percentage of the progress (this is a method)
+ """
+ def __init__(self, maxval=100, widgets=default_widgets, term_width=None,
+ fd=sys.stderr):
+ assert maxval > 0
+ self.maxval = maxval
+ self.widgets = widgets
+ self.fd = fd
+ self.signal_set = False
+ if term_width is None:
+ try:
+ self.handle_resize(None,None)
+ signal.signal(signal.SIGWINCH, self.handle_resize)
+ self.signal_set = True
+ except:
+ self.term_width = 79
+ else:
+ self.term_width = term_width
+
+ self.currval = 0
+ self.finished = False
+ self.prev_percentage = -1
+ self.start_time = None
+ self.seconds_elapsed = 0
+
+ def handle_resize(self, signum, frame):
+ h,w=array('h', ioctl(self.fd,termios.TIOCGWINSZ,'\0'*8))[:2]
+ self.term_width = w
+
+ def percentage(self):
+ "Returns the percentage of the progress."
+ return self.currval*100.0 / self.maxval
+
+ def _format_widgets(self):
+ r = []
+ hfill_inds = []
+ num_hfill = 0
+ currwidth = 0
+ for i, w in enumerate(self.widgets):
+ if isinstance(w, ProgressBarWidgetHFill):
+ r.append(w)
+ hfill_inds.append(i)
+ num_hfill += 1
+ elif isinstance(w, (str, unicode)):
+ r.append(w)
+ currwidth += len(w)
+ else:
+ weval = w.update(self)
+ currwidth += len(weval)
+ r.append(weval)
+ for iw in hfill_inds:
+ r[iw] = r[iw].update(self, (self.term_width-currwidth)/num_hfill)
+ return r
+
+ def _format_line(self):
+ return ''.join(self._format_widgets()).ljust(self.term_width)
+
+ def _need_update(self):
+ return int(self.percentage()) != int(self.prev_percentage)
+
+ def update(self, value):
+ "Updates the progress bar to a new value."
+ #assert 0 <= value <= self.maxval
+ self.currval = value
+ if not self._need_update() or self.finished:
+ return
+ if not self.start_time:
+ self.start_time = time.time()
+ self.seconds_elapsed = time.time() - self.start_time
+ self.prev_percentage = self.percentage()
+ if value != self.maxval:
+ self.fd.write(self._format_line() + '\r')
+ else:
+ self.finished = True
+ self.fd.write(self._format_line() + '\n')
+
+ def start(self):
+ """Start measuring time, and prints the bar at 0%.
+
+ It returns self so you can use it like this:
+ >>> pbar = ProgressBar().start()
+ >>> for i in xrange(100):
+ ... # do something
+ ... pbar.update(i+1)
+ ...
+ >>> pbar.finish()
+ """
+ self.update(0)
+ return self
+
+ def finish(self):
+ """Used to tell the progress is finished."""
+ self.update(self.maxval)
+ if self.signal_set:
+ signal.signal(signal.SIGWINCH, signal.SIG_DFL)
+
+
+
+
+
+
+if __name__=='__main__':
+ import os
+
+ def example1():
+ widgets = ['Test: ', Percentage(), ' ', Bar(marker=RotatingMarker()),
+ ' ', ETA(), ' ', FileTransferSpeed()]
+ pbar = ProgressBar(widgets=widgets, maxval=10000000).start()
+ for i in range(1000000):
+ # do something
+ pbar.update(10*i+1)
+ pbar.finish()
+ print
+
+ def example2():
+ class CrazyFileTransferSpeed(FileTransferSpeed):
+ "It's bigger between 45 and 80 percent"
+ def update(self, pbar):
+ if 45 < pbar.percentage() < 80:
+ return 'Bigger Now ' + FileTransferSpeed.update(self,pbar)
+ else:
+ return FileTransferSpeed.update(self,pbar)
+
+ widgets = [CrazyFileTransferSpeed(),' <<<', Bar(), '>>> ', Percentage(),' ', ETA()]
+ pbar = ProgressBar(widgets=widgets, maxval=10000000)
+ # maybe do something
+ pbar.start()
+ for i in range(2000000):
+ # do something
+ pbar.update(5*i+1)
+ pbar.finish()
+ print
+
+ def example3():
+ widgets = [Bar('>'), ' ', ETA(), ' ', ReverseBar('<')]
+ pbar = ProgressBar(widgets=widgets, maxval=10000000).start()
+ for i in range(1000000):
+ # do something
+ pbar.update(10*i+1)
+ pbar.finish()
+ print
+
+ def example4():
+ widgets = ['Test: ', Percentage(), ' ',
+ Bar(marker='0',left='[',right=']'),
+ ' ', ETA(), ' ', FileTransferSpeed()]
+ pbar = ProgressBar(widgets=widgets, maxval=500)
+ pbar.start()
+ for i in range(100,500+1,50):
+ time.sleep(0.2)
+ pbar.update(i)
+ pbar.finish()
+ print
+
+
+ example1()
+ example2()
+ example3()
+ example4()
+
Added: trunk/pypt-offline.sh
===================================================================
--- trunk/pypt-offline.sh (rev 0)
+++ trunk/pypt-offline.sh 2006-12-29 18:14:53 UTC (rev 106)
@@ -0,0 +1,28 @@
+#!/usr/bin/env python
+# pypt-offline.py
+#
+############################################################################
+# Copyright (C) 2005, 2006 Ritesh Raj Sarraf #
+# rr...@re... #
+# #
+# This program is free software; you can redistribute it and#or modify #
+# it under the terms of the GNU General Public License as published by #
+# the Free Software Foundation; either version 2 of the License, or #
+# (at your option) any later version. #
+# #
+# This program is distributed in the hope that it will be useful, #
+# but WITHOUT ANY WARRANTY; without even the implied warranty of #
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the #
+# GNU General Public License for more details. #
+# #
+# You should have received a copy of the GNU General Public License #
+# along with this program; if not, write to the #
+# Free Software Foundation, Inc., #
+# 59 Temple Place - Suite 330, Boston, MA 02111-1307, USA. #
+############################################################################
+
+
+from pypt_core import main
+
+if __name__ == "__main__":
+ main()
Property changes on: trunk/pypt-offline.sh
___________________________________________________________________
Name: svn:executable
+ *
Copied: trunk/pypt_core.py (from rev 105, trunk/pypt-offline/pypt_core.py)
===================================================================
--- trunk/pypt_core.py (rev 0)
+++ trunk/pypt_core.py 2006-12-29 18:14:53 UTC (rev 106)
@@ -0,0 +1,814 @@
+import os, shutil, string, sys, urllib2, Queue, threading
+import pypt_progressbar, pypt_md5_check, pypt_variables, pypt_logger, progressbar
+
+'''This is the core module. It does the main job of downloading packages/update packages,\nfiguring out if the packages are in the local cache, handling exceptions and much more'''
+
+def compress_the_file(zip_file_name, files_to_compress, download_dir):
+ '''Condenses all the files into one single file for easy transfer'''
+
+ try:
+ import zipfile
+ except ImportError:
+ log.err("Aieee!! Module not found.\n")
+
+ try:
+ os.chdir(download_dir)
+ except:
+ #TODO: Handle this exception
+ log.err("Aieeee! I got a fatal exception that I don't understand.\nPlease debug.\n")
+
+ try:
+ filename = zipfile.ZipFile(zip_file_name, "a")
+ except IOError:
+ #INFO By design zipfile throws an IOError exception when you open
+ # in "append" mode and the file is not present.
+ filename = zipfile.ZipFile(zip_file_name, "w")
+ except:
+ #TODO Handle the exception
+ log.err("\nAieee! Some error exception in creating zip file %s\n" % (zip_file_name))
+ sys.exit(1)
+
+ filename.write(files_to_compress, files_to_compress, zipfile.ZIP_DEFLATED)
+ filename.close()
+
+def decompress_the_file(file, path, filename, archive_type):
+ '''Extracts all the files from a single condensed archive file'''
+
+
+ if archive_type == 1:
+ try:
+ import bz2
+ except ImportError:
+ log.err("Aieeee! Module bz2 is not available.\n")
+
+ try:
+ fh = bz2.BZ2File(file, 'r')
+ except:
+ log.err("Couldn't open file %s for reading.\n" % (file))
+
+ try:
+ wr_fh = open (os.path.join(path, filename), 'wb')
+ except:
+ log.err("Couldn't open file %s at path %s for writing.\n" % (filename, path))
+
+ try:
+ wr_fh.write(fh.read())
+ except EOFError, e:
+ log.err("Bad file %s\n%s" % (file, e))
+ pass
+
+ wr_fh.close()
+ fh.close()
+ log.msg("%s file synced\n" % (filename))
+
+ elif archive_type == 2:
+ try:
+ import gzip
+ except ImportError:
+ log.err("Aieee! Module gzip is not available.\n")
+
+ try:
+ fh = gzip.GzipFile(file, 'r')
+ except:
+ log.err("Couldn't open file %s for reading.\n" % (file))
+
+ try:
+ wr_fh = open(os.path.join(path,filename), 'wb')
+ except:
+ log.err("Couldn't open file %s at path %s for writing.\n" % (filename, path))
+
+ try:
+ wr_fh.write(fh.read())
+ except EOFError, e:
+ log.err("Bad file %s\n%s" % (file, e))
+ pass
+
+ wr_fh.close()
+ fh.close()
+ log.msg("%s file synced\n" % (filename))
+
+ elif archive_type == 3:
+ try:
+ import zipfile
+ except ImportError:
+ log.err("Aieee! Module zipfile is not available.\n")
+
+ try:
+ zip_file = zipfile.ZipFile(file, 'r')
+ except:
+ #TODO: Handle the exceptions
+ log.err("\nAieee! Some error exception in reading the zip file %s\n" % (file))
+ return False
+
+ for filename in zip_file.namelist():
+ data = zip_file.read(filename)
+
+ zip_file.close()
+ else:
+ log.err("Aieeee! %s is unknown archive.\n" % (file))
+ return False
+
+ return True
+
+def download_from_web(url, file, download_dir, checksum, number_of_threads, thread_name):
+ '''
+ Download the required file from the web
+ The arguments are passed every time to the function so that,
+ maybe in future, we could reuse this function
+ '''
+
+ try:
+ block_size = 4096
+ i = 0
+ counter = 0
+
+ os.chdir(download_dir)
+ temp = urllib2.urlopen(url)
+ headers = temp.info()
+ size = int(headers['Content-Length'])
+ data = open(file,'wb')
+
+ log.msg("Downloading %s\n" % (file))
+ prog = pypt_progressbar.myReportHook(size, number_of_threads)
+ #widgets = ['Test: ', progressbar.Percentage(), ' ', progressbar.Bar(marker=progressbar.RotatingMarker()), ' ', progressbar.ETA(), ' ', progressbar.FileTransferSpeed()]
+ #widgets = [CrazyFileTransferSpeed(),' <<<', Bar(), '>>> ', Percentage(),' ', ETA()]
+ #pbar = progressbar.ProgressBar(widgets=widgets, maxval=size)
+ #pbar.start()
+ while i < size:
+ data.write (temp.read(block_size))
+ i += block_size
+ counter += 1
+ #pbar.update(i)
+ prog.updateAmount(counter * block_size, thread_name)
+ #pbar.finish()
+ #print "\n"
+ data.close()
+ temp.close()
+
+ #INFO: Do an md5 checksum
+ if pypt_variables.options.disable_md5check == True:
+ pass
+ else:
+ if pypt_md5_check.md5_check(file, checksum, download_dir) != True:
+ os.unlink(file)
+ log.err("%s checksum mismatch. File removed\n" % (file))
+ return False
+ log.verbose("%s successfully downloaded from %s\n\n" % (file, url))
+ return True
+
+ #FIXME: Find out optimal fix for this exception handling
+ except OSError, (errno, strerror):
+ #log.err("%s\n" %(download_dir))
+ errfunc(errno, strerror, download_dir)
+
+ except urllib2.HTTPError, errstring:
+ #log.err("%s\n" % (file))
+ errfunc(errstring.code, errstring.msg, file)
+
+ except urllib2.URLError, errstring:
+ #We pass error code "1" here because URLError
+ # doesn't pass any error code.
+ # URLErrors shouldn't be ignored, hence program termination
+ if errstring.reason.args[0] == 10060:
+ errfunc(errstring.reason.args[0], errstring.reason, url)
+ #errfunc(1, errstring.reason)
+ #pass
+
+ except IOError, e:
+ if hasattr(e, 'reason'):
+ log.err("%s\n" % (e.reason))
+ if hasattr(e, 'code') and hasattr(e, 'reason'):
+ errfunc(e.code, e.reason, file)
+
+#TODO: walk_tree_copy_debs - DEPRECATED
+# This might require simplification and optimization.
+# But for now it's doing the job.
+# Need to find a better algorithm, maybe os.walk()
+def walk_tree_copy_debs(cache, sFile, sSourceDir):
+ '''
+ This function checks for a package to see if its already downloaded
+ It can search directories with depths.
+ '''
+ #The core algorithm is here for the whole program to function.
+ #It recursively searches a tree/subtree of folders for package files,
+ #like the directory structure of "apt-proxy". If files are found (.deb || .rpm)
+ #it checks whether they are on the list of packages to be fetched. If yes,
+ #it copies them. The same goes for flat "apt archive folders" also.
+ #Else it fetches the package from the net.
+ bFound = False
+ try:
+ if cache is not None:
+ for name in os.listdir(cache):
+ if bFound == True:
+ break
+ path = os.path.join(cache, name)
+ if os.path.isdir(path):
+ walk_tree_copy_debs(path, sFile, sSourceDir)
+ #walk_tree_copy_debs(path, sFile)
+ elif name.endswith('.deb') or name.endswith('.rpm'):
+ if name == sFile:
+ try:
+ shutil.copy(path, sSourceDir)
+ except IOError, (errno, errstring):
+ errfunc(errno, errstring)
+ except shutil.Error:
+ log.msg("%s is available in %s. Skipping Copy!\n" % (name, sSourceDir))
+ bFound = True
+ break
+
+ #shutil.copy(path, sSourceDir)
+ #bFound = True
+ #break
+ #return bFound
+ #return False
+ except OSError, (errno, strerror):
+ log.err("%s %s\n" % (errno, strerror))
+ errfunc(errno, strerror)
+
+
+def files(root):
+ for path, folders, files in os.walk(root):
+ for file in files:
+ yield path, file
+
+def copy_first_match(cache_dir, filename, dest_dir, checksum): # aka new_walk_tree_copy()
+ '''Walks into "reposiotry" looking for "filename".
+ If found, copies it to "dest_dir" but first verifies their md5 "checksum".'''
+
+ # If the repository is not given, we'll return None because the user wants to download
+ # it from the web
+ # There's no need to walk also because the user knows that he doesn't have any cache_dir
+ # Earlier implementation of having a default dir (os.curdir()) hit performance badly because
+ # at times it would start the walk from "C:\" or "/"
+ if cache_dir is None:
+ return False
+
+ for path, file in files(cache_dir):
+ if file == filename:
+ #INFO: md5check is compulsory here
+ # There's no point in checking for the disable-md5 option because
+ # copying a damaged file is of no use
+ if pypt_md5_check.md5_check(file, checksum, path) == True:
+ try:
+ shutil.copy(os.path.join(path, file), dest_dir)
+ except shutil.Error:
+ log.msg("%s available. Skipping Copy!\n\n" % (file, dest_dir))
+ return True
+ return False
+
+def stripper(item):
+ '''Strips extra characters from "item".
+ Breaks "item" into:
+ url - The URL
+ file - The actual package file
+ size - The file size
+ md5_text - The md5 checksum text
+ and returns them.'''
+
+ #INFO: This is obsolete
+ #lSplitData = each_single_item.split(' ') # Split on the basis of ' ' i.e. space
+ # We initialize the variables "sUrl" and "sFile" here.
+ # We also strip the single quote character "'" to get the real data
+ #sUrl = string.rstrip(string.lstrip(''.join(lSplitData[0]), chars="'"), chars="'")
+ #sFile = string.rstrip(string.lstrip(''.join(lSplitData[1]), chars="'"), chars="'")
+
+ item = item.split(' ')
+ url = string.rstrip(string.lstrip(''.join(item[0]), chars="'"), chars="'")
+ file = string.rstrip(string.lstrip(''.join(item[1]), chars="'"), chars="'")
+ size = string.rstrip(string.lstrip(''.join(item[2]), chars = "'"), chars="'")
+ #INFO: md5 ends up having '\n' with it.
+ # That needs to be stripped too.
+ md5_text = string.rstrip(string.lstrip(''.join(item[3]), chars = "'"), chars = "'")
+ md5_text = string.rstrip(md5_text, chars = "\n")
+
+ return url, file, size, md5_text
+
+
+def errfunc(errno, errormsg, filename):
+ '''
+ We use errfunc to handle errors.
+ There are some error codes (-3 and 13 as of now)
+ which are temporary codes, they happen when there
+ is a temporary resolution failure, for example.
+ For such situations, we can't abort because the
+ uri file might have other hosts also, which might
+ be well accessible.
+ This function does the job of behaving accordingly
+ as per the error codes.
+ '''
+
+ if errno == -3 or errno == 13:
+ #TODO: Find out what these error codes are for
+ # and better document them the next time you find it out.
+ # 13 is for "Permission Denied" when you don't have privileges to access the destination
+ pass
+ elif errno == 407 or errno == 2:
+ # These, I believe are from OSError/IOError exception.
+ # I'll document it as soon as I confirm it.
+ log.err("%s\n" % (errormsg))
+ sys.exit(errno)
+ elif errno == 504 or errno == 404 or errno == 10060:
+ #TODO: Counter which will inform that some packages weren't fetched.
+ # A counter needs to be implemented which will at the end inform the list of sources which
+ # failed to be downloaded with the above codes.
+
+ # 504 is for gateway timeout
+ # On gateway timeouts we can keep trying out because
+ # one apt source.list might have different hosts.
+ # 404 is for URL error. Page not found.
+ # There can be instances where one source is changed but the rest are working.
+ # 10060 is for Operation Time out. There can be multiple reasons for this timeout
+ # Primarily if the host is down or a slow network or abruption, hence not the whole execution should be aborted
+ log.err("%s - %s - %s\n" % (filename, errno, errormsg))
+ log.verbose(" Will still try with other package uris\n\n")
+ pass
+ elif errno == 1:
+ # We'll pass error code 1 where ever we want to gracefully exit
+ log.err(errormsg)
+ log.err("Explicit program termination %s\n" % (errno))
+ sys.exit(errno)
+ else:
+ log.err("Aieee! I don't understand this errorcode\n" % (errno))
+ sys.exit(errno)
+
+def fetcher(url_file, download_dir, cache_dir, zip_bool, zip_type_file, arg_type = 0):
+ '''
+ url_file - The uri file which will contain the information
+ download_dir - The path (if any) where the download needs to be done
+ cache_dir - The cache (if any) where we should check before downloading from the net
+ arg_type - arg_type is basically used to identify whether it's an update download or an upgrade download
+ '''
+
+ if arg_type == 1:
+ #INFO: Oh! We're only downloading the update package list database
+ # Package Update database changes almost daily in Debian.
+ # This is at least true for Sid. Hence it doesn't make sense to copy
+ # update packages' database from a cache.
+
+ if download_dir is None:
+ if os.access("pypt-downloads", os.W_OK) is True:
+ download_dir = os.path.abspath("pypt-downloads")
+ else:
+ try:
+ os.umask(0002)
+ os.mkdir("pypt-downloads")
+ download_dir = os.path.abspath("pypt-downloads")
+ except:
+ log.err("Aieeee! I couldn't create a directory")
+ errfunc(1, '')
+ else:
+ download_dir = os.path.abspath(download_dir)
+
+ if os.access(os.path.join(download_dir, zip_type_file), os.F_OK):
+ log.err("%s already present.\nRemove it first.\n" % (zip_type_file))
+ sys.exit(1)
+
+ try:
+ raw_data_list = open(url_file, 'r').readlines()
+ except IOError, (errno, strerror):
+ log.err("%s %s\n" % (errno, strerror))
+ errfunc(errno, '')
+
+ #INFO: Mac OS is having issues with Python Threading.
+ # Use the conventional model for Mac OS
+ if sys.platform == 'darwin':
+ log.verbose("Running on Mac OS. Python doesn't have proper support for Threads on Mac OS X.\n")
+ log.verbose("Running in the conventional non-threaded way.\n")
+ for each_single_item in raw_data_list:
+ (url, file, download_size, checksum) = stripper(each_single_item)
+ if download_from_web(url, file, download_dir, None) != True:
+ pypt_variables.errlist.append(file)
+ else:
+ if zip_bool:
+ compress_the_file(zip_type_file, file, download_dir)
+ os.unlink(os.path.join(download_dir, file)) # Remove it because we don't need the file once it is zipped.
+ else:
+ #INFO: Thread Support
+ if pypt_variables.options.num_of_threads > 1:
+ log.msg("WARNING: Threads is still in alpha stage. It's better to use just a single thread at the moment.\n")
+ log.warn("Threads is still in alpha stage. It's better to use just a single thread at the moment.\n")
+
+ NUMTHREADS = pypt_variables.options.num_of_threads
+ ziplock = threading.Lock()
+
+ def run(request, response, func=download_from_web):
+ '''Get items from the request Queue, process them
+ with func(), put the results along with the
+ Thread's name into the response Queue.
+
+ Stop running once an item is None.'''
+
+ while 1:
+ item = request.get()
+ if item is None:
+ break
+ (url, file, download_size, checksum) = stripper(item)
+ thread_name = threading.currentThread().getName()
+ response.put((thread_name, url, file, func(url, file, download_dir, None, NUMTHREADS, thread_name)))
+
+ # This will take care of making sure that if downloaded, they are zipped
+ (thread_name, url, file, exit_status) = responseQueue.get()
+ if exit_status == True:
+ if zip_bool:
+ ziplock.acquire()
+ try:
+ compress_the_file(zip_type_file, file, download_dir)
+ os.unlink(os.path.join(download_dir, file)) # Remove it because we don't need the file once it is zipped.
+ finally:
+ ziplock.release()
+ else:
+ pypt_variables.errlist.append(file)
+ #pass
+
+ # Create two Queues for the requests and responses
+ requestQueue = Queue.Queue()
+ responseQueue = Queue.Queue()
+
+ # Pool of NUMTHREADS Threads that run run().
+ thread_pool = [
+ threading.Thread(
+ target=run,
+ args=(requestQueue, responseQueue)
+ )
+ for i in range(NUMTHREADS)
+ ]
+
+ # Start the threads.
+ for t in thread_pool: t.start()
+
+ # Queue up the requests.
+ for item in raw_data_list: requestQueue.put(item)
+
+ # Shut down the threads after all requests end.
+ # (Put one None "sentinel" for each thread.)
+ for t in thread_pool: requestQueue.put(None)
+
+ # Don't end the program prematurely.
+ #
+ # (Note that because Queue.get() is blocking by
+ # default this isn't strictly necessary. But if
+ # you were, say, handling responses in another
+ # thread, you'd want something like this in your
+ # main thread.)
+ for t in thread_pool: t.join()
+
+ if arg_type == 2:
+ if download_dir is None:
+ if os.access("pypt-downloads", os.W_OK) is True:
+ download_dir = os.path.abspath("pypt-downloads")
+ else:
+ try:
+ os.umask(0002)
+ os.mkdir("pypt-downloads")
+ download_dir = os.path.abspath("pypt-downloads")
+ except:
+ log.err("Aieeee! I couldn't create a directory")
+ else:
+ download_dir = os.path.abspath(download_dir)
+
+ if os.access(os.path.join(download_dir, zip_type_file), os.F_OK):
+ log.err("%s already present.\nRemove it first.\n" % (zip_type_file))
+ sys.exit(1)
+
+ if cache_dir is not None:
+ cache_dir = os.path.abspath(cache_dir)
+
+ try:
+ raw_data_list = open(url_file, 'r').readlines()
+ except IOError, (errno, strerror):
+ log.err("%s %s\n" %(errno, strerror))
+ errfunc(errno, '')
+
+ #INFO: Mac OS X is misbehaving with Python Threading
+ # Use the conventional model for Mac OS X
+ if sys.platform == 'darwin':
+ log.verbose("Running on Mac OS. Python doesn't have proper support for Threads on Mac OS X.\n")
+ log.verbose("Running in the conventional non-threaded way.\n")
+ for each_single_item in raw_data_list:
+ (url, file, download_size, checksum) = stripper(each_single_item)
+
+ if cache_dir is None:
+ if download_from_web(url, file, download_dir, checksum) != True:
+ pypt_variables.errlist.append(file)
+ if zip_bool:
+ compress_the_file(zip_type_file, file, download_dir)
+ os.unlink(os.path.join(download_dir, file))
+ else:
+ if copy_first_match(cache_dir, file, download_dir, checksum) == False:
+ if download_from_web(url, file, download_dir, checksum) != True:
+ pypt_variables.errlist.append(file)
+ else:
+ if os.access(os.path.join(cache_dir, file), os.F_OK):
+ log.debug("%s file is already present in cache-dir %s. Skipping copy.\n" % (file, cache_dir)) #INFO: The file is already there.
+ log.verbose("%s file is already present in cache-dir %s. Skipping copy.\n" % (file, cache_dir))
+ else:
+ if os.access(cache_dir, os.W_OK):
+ shutil.copy(file, cache_dir)
+ log.verbose("%s copied to %s\n" % (file, cache_dir))
+ else:
+ log.verbose("Cannot copy %s to %s. Is %s writeable??\n" % (file, cache_dir))
+
+ if zip_bool:
+ compress_the_file(zip_type_file, file, download_dir)
+ os.unlink(os.path.join(download_dir, file))
+ elif True:
+ if zip_bool:
+ compress_the_file(zip_type_file, file, download_dir)
+ os.unlink(os.path.join(download_dir, file))
+ else:
+ #INFO: Thread Support
+ if pypt_variables.options.num_of_threads > 1:
+ log.msg("WARNING: Threads is still in alpha stage. It's better to use just a single thread at the moment.\n")
+ log.warn("Threads is still in alpha stage. It's better to use just a single thread at the moment.\n")
+
+ NUMTHREADS = pypt_variables.options.num_of_threads
+ ziplock = threading.Lock()
+
+ def run(request, response, func=copy_first_match):
+ '''Get items from the request Queue, process them
+ with func(), put the results along with the
+ Thread's name into the response Queue.
+
+ Stop running once an item is None.'''
+
+ while 1:
+ item = request.get()
+ if item is None:
+ break
+ (url, file, download_size, checksum) = stripper(item)
+ thread_name = threading.currentThread().getName()
+ response.put((thread_name, url, file, func(cache_dir, file, download_dir, checksum)))
+
+ # This will take care of making sure that if downloaded, they are zipped
+ (thread_name, url, file, exit_status) = responseQueue.get()
+ if exit_status == True:
+ log.msg("%s copied from cache.\n" % (file))
+ log.verbose("%s copied from cache-dir %s.\n" % (file, cache_dir))
+ log.debug("%s copied from cache-dir %s.\n" % (file, cache_dir))
+ else:
+ log.debug("%s not available in local cache %s\n" % (file, cache_dir))
+ log.verbose("%s not available in local cache %s\n" % (file, cache_dir))
+ exit_status = download_from_web(url, file, download_dir, checksum, NUMTHREADS, thread_name)
+
+ if exit_status:
+
+ #INFO: copy to cache-dir for further use
+ # Here we try copying the downloaded file to the cache-dir
+ # so that if the same file is asked for again, it can be copied from the local storage device
+ if cache_dir is None:
+ log.debug("No cache-dir specified. Skipping copy.\n")
+ elif os.access(os.path.join(cache_dir, file), os.F_OK):
+ log.debug("%s is already present in %s.\n" % (file, cache_dir))
+ else:
+ if os.access(cache_dir, os.W_OK):
+ shutil.copy(file, cache_dir)
+ log.debug("%s copied to local cache-dir %s.\n" % (file, cache_dir))
+ log.verbose("%s copied to local cache-dir %s.\n" % (file, cache_dir))
+
+ if zip_bool:
+ ziplock.acquire()
+ try:
+ compress_the_file(zip_type_file, file, download_dir)
+ os.unlink(os.path.join(download_dir, file)) # Remove it because we don't need the file once it is zipped.
+ finally:
+ ziplock.release()
+ else:
+ pypt_variables.errlist.append(file)
+
+ # Create two Queues for the requests and responses
+ requestQueue = Queue.Queue()
+ responseQueue = Queue.Queue()
+
+
+ # Pool of NUMTHREADS Threads that run run().
+ thread_pool = [
+ threading.Thread(
+ target=run,
+ args=(requestQueue, responseQueue)
+ )
+ for i in range(NUMTHREADS)
+ ]
+
+ # Start the threads.
+ for t in thread_pool: t.start()
+
+ # Queue up the requests.
+ for item in raw_data_list: requestQueue.put(item)
+
+ # Shut down the threads after all requests end.
+ # (Put one None "sentinel" for each thread.)
+ for t in thread_pool: requestQueue.put(None)
+
+ # Don't end the program prematurely.
+ #
+ # (Note that because Queue.get() is blocking by
+ # default this isn't strictly necessary. But if
+ # you were, say, handling responses in another
+ # thread, you'd want something like this in your
+ # main thread.)
+ for t in thread_pool: t.join()
+
+ # Print the failed files
+ if len(pypt_variables.errlist) == 0:
+ pass # Don't print if nothing failed.
+ else:
+ log.err("The following files failed to be downloaded.\n")
+ for error in pypt_variables.errlist:
+ log.err("%s failed.\n" % (error))
+
+def syncer(install_file_path, target_path, arg_type=None):
+ '''Syncer does the work of syncing the downloaded files.
+ It syncs "install_file_path" which could be a valid file path
+ or a zip archive to "target_path'''
+
+ if arg_type == 1:
+ try:
+ import zipfile
+ except ImportError:
+ log.err("Aieeee! Module zipfile not found.\n")
+ sys.exit(1)
+
+ try:
+ import pypt_magic
+ except ImportError:
+ log.err("Aieeee! Module pypt_magic not found.\n")
+ sys.exit(1)
+
+ file = zipfile.ZipFile(install_file_path, "r")
+ for filename in file.namelist():
+
+ data = open(filename, "wb")
+ data.write(file.read(filename))
+ data.close()
+
+ #FIXME: Fix this tempfile feature
+ # Access to the temporary file is not being allowed
+ # It's throwing a Permission denied exception
+ #try:
+ # import tempfile
+ #except ImportError:
+ # sys.stderr.write("Aieeee! Module pypt_magic not found.\n")
+ # sys.exit(1)
+ #data = tempfile.NamedTemporaryFile('wb', -1, '', '', os.curdir)
+ #data.write(file.read(filename))
+ #data = file.read(filename)
+
+ if pypt_magic.file(os.path.abspath(filename)) == "application/x-bzip2":
+ decompress_the_file(os.path.abspath(filename), target_path, filename, 1)
+ elif pypt_magic.file(os.path.abspath(filename)) == "application/x-gzip":
+ decompress_the_file(os.path.abspath(filename), target_path, filename, 2)
+ elif pypt_magic.file(filename) == "PGP armored data" or pypt_magic.file(filename) == "application/x-dpkg":
+ if os.access(...
[truncated message content] |