Exploration and categorization of CREs and CRMs
Cis-regulatory elements (CREs) and cis-regulatory modules (CRMs) play an important role in the temporal and spatial regulation of gene expression, a process common to eukaryotic organisms. We developed two programs that serve as exploratory tools in the analysis of CRM-mediated control of gene expression: “Exploration of Distinctive CREs and CRMs” (EDCC) and “CRM Network Generator” (CNG). EDCC correlates the presence and positions of CREs/CRMs with gene expression data and identifies candidate regulatory elements for further functional analysis. CNG provides an unbiased neural-network approach to assess the importance of the positional features determined by EDCC. To sustain high computational performance even for large datasets, the programs, written mostly in Python 3, use k-mer based indexing, parallelization, and a neural-network approach for categorization. For further information, please refer to the publication: doi.org/10.1371/journal.pone.0190421
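The k-mer based indexing mentioned above can be illustrated with a minimal Python sketch (a generic illustration of the technique, not EDCC/CNG code; the sequence and function name here are hypothetical):

```python
from collections import defaultdict

def build_kmer_index(sequence, k):
    """Map every k-mer in the sequence to the list of positions where it occurs."""
    index = defaultdict(list)
    for i in range(len(sequence) - k + 1):
        index[sequence[i:i + k]].append(i)
    return index

# Example: locate a short candidate motif in a toy promoter sequence.
promoter = "ATGCGTACGTTAGCATGCGT"
index = build_kmer_index(promoter, k=5)
print(index["ATGCG"])  # positions where the 5-mer occurs: [0, 14]
```

Once built, the index answers "where does this motif occur?" in constant time instead of rescanning the sequence, which is what makes the approach scale to large datasets.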
The Website2Pdf application converts webpages into offline PDF files.
Favorite webpages can be made available offline as PDF files. Enter the URL of your favorite website and, with just one click, PDF files will be created without any loss of CSS or HTML styling. All web assets are retained. Please consult the help button before converting webpages to offline files.
Perl Web Scraping Project
Web scraping (web harvesting or web data extraction) is data scraping used to extract data from websites. Web scraping software may access the World Wide Web directly using the Hypertext Transfer Protocol, or through a web browser. While web scraping can be done manually by a software user, the term typically refers to automated processes implemented using a bot or web crawler. It is a form of copying in which specific data is gathered and copied from the web, typically into a central local database or spreadsheet, for later retrieval or analysis. Scraping a web page involves fetching it and then extracting data from it. Fetching is the downloading of a page (which a browser does when you view it); web crawling is therefore a main component of web scraping, as it fetches pages for later processing. Once a page is fetched, extraction can take place: the content may be parsed, searched, reformatted, its data copied into a spreadsheet, and so on.
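The fetch-then-extract workflow described above can be sketched with the standard library (this project itself is written in Perl; the sketch below uses Python only as a language-agnostic illustration, and the commented-out URL is a placeholder):

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collect the href attribute of every anchor tag encountered."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

def scrape_links(html):
    """Extraction step: parse fetched HTML and pull out the data of interest."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

# Fetch step (what a crawler does); left commented so the sketch runs offline:
# html = urlopen("https://example.com").read().decode("utf-8")
html = '<p>See <a href="/docs">the docs</a> and <a href="/faq">the FAQ</a>.</p>'
print(scrape_links(html))  # ['/docs', '/faq']
```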
Source code for a simple text editor written in Perl.
A simple text editor. Run: $ perl 01text0.pl
Create Perl/Tk GUIs for your programs.
Simply run: $ perl ZooZ.pl, then build your GUI.
Volocian believes first and foremost that financial status shouldn’t be a barrier to entry in multimedia content creation. Whether you want to design a product or website, record a band, remix a song, or produce a feature-length movie with professional post-production graphics and effects, Volocian™ wants to help, even if you’re using borrowed hardware with no previous experience. We provide affordable products and services for any budget, including cost-free solutions for education and demo purposes. Everything we do is designed with ease of use and flexibility in mind, to suit everyone from novice hobbyists to expert professionals. Wherever possible, Volocian™ relies on free, open-source, and cross-platform software to prevent vendor lock-in, planned obsolescence, and software-as-a-service licensing. While we remain pragmatic about the use of proprietary software, as we’re aware of the feature limitations of some open-source solutions, free solutions are preferred.
Content Based File level Data Backup in Python
Content Based File level Data Backup is a utility to back up your files, supporting both full and incremental backups. It takes a directory as input and backs up the files in that folder and all sub-folders to the backup destination directory, optionally compressing each file individually. It mirrors the source directory structure under the target directory and creates one archive file for each source file, so even if the backup/restore program or the backup database is unavailable or lost, the backup files and structure can still be recovered; this is a huge plus for reliability. A SQL database is used to manage backup file information, and files are tracked by their checksum (SHA-256). See the wiki page for additional information: https://sourceforge.net/p/filebasedbackup/wiki/Home/ The project is a work in progress: the main backup script is functional, but I have taken a break from adding features to the main program to build a GUI in Tkinter.
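The checksum-driven incremental logic can be sketched as follows (a minimal illustration of the idea, not the project's actual code; the function names are hypothetical, and the real utility additionally records results in its SQL database and archives the changed files):

```python
import hashlib
from pathlib import Path

def file_checksum(path, chunk_size=65536):
    """SHA-256 of a file, read in chunks so large files don't fill memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def files_to_back_up(source_dir, known_checksums):
    """Return files whose content is new or changed since the last run.

    known_checksums maps file path -> SHA-256 hex digest from the previous
    backup; anything missing or mismatched needs an incremental backup.
    """
    changed = []
    for path in sorted(Path(source_dir).rglob("*")):
        if path.is_file() and file_checksum(path) != known_checksums.get(str(path)):
            changed.append(path)
    return changed
```

Because identity is content-based, a file that is merely touched (same bytes, new timestamp) is not backed up again, while a renamed copy with changed content is.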
Unified Test and Logging layer for multiple programming languages
Modern software systems and applications are commonly written in multiple languages, include scripting engines, and are frequently built on multiple specialized frameworks and middleware for a considerable diversity of runtime environments. The most recent influential change in the development paradigm is the adoption of multicore processors. This project aims to unify the required trace and logging output and to integrate it into debugging environments. The goal is to provide general development, test, and production support for software environments based on multiple programming languages in distributed multicore settings.
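One common way to unify log output across languages is to agree on a structured record format that every runtime can emit; the sketch below shows the idea in Python with entirely hypothetical field names (this is not the project's actual format or API), the point being that shims in other languages can produce the same JSON shape so records from a multi-language system merge into one stream:

```python
import json
import sys
import time

def unified_log(level, message, component, language="python", stream=sys.stdout):
    """Emit one structured, line-delimited JSON log record.

    The same record shape can be produced by thin logging shims in other
    languages, so output from all parts of a mixed-language system can be
    merged, sorted by timestamp, and filtered uniformly.
    """
    record = {
        "ts": time.time(),
        "level": level,
        "language": language,
        "component": component,
        "message": message,
    }
    stream.write(json.dumps(record) + "\n")
    return record

unified_log("INFO", "worker started", component="scheduler")
```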
Simple CAD program
SAMoCAD is a program for creating simple drawings, with partial DXF support. Main features:
- create simple objects (line, arc, circle)
- create complex objects (text, dimensions)
- edit drawn objects
- save drawings in SVG format
- output drawings in PostScript format
- export/import DXF file content (LINE, CIRCLE, ARC, TEXT, and DIMENSION primitives)
The program is written in Python 2 and uses the Tkinter library. It requires no installation, but Python 2.7 or later must be installed on your PC. Run the file SAMoCAD.pyw to start the program. Note that this is a very old version of the program! For more information see
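Saving a drawing to SVG, as listed above, amounts to serializing each drawn primitive into the corresponding SVG element. A minimal Python sketch for lines and circles (the dictionary layout here is hypothetical, not SAMoCAD's internal model):

```python
def drawing_to_svg(primitives, width=200, height=200):
    """Serialize a list of line/circle primitives into a minimal SVG document."""
    body = []
    for p in primitives:
        if p["type"] == "line":
            body.append('<line x1="%g" y1="%g" x2="%g" y2="%g" stroke="black"/>'
                        % (p["x1"], p["y1"], p["x2"], p["y2"]))
        elif p["type"] == "circle":
            body.append('<circle cx="%g" cy="%g" r="%g" fill="none" stroke="black"/>'
                        % (p["cx"], p["cy"], p["r"]))
    header = ('<svg xmlns="http://www.w3.org/2000/svg" width="%d" height="%d">'
              % (width, height))
    return header + "\n" + "\n".join(body) + "\n</svg>"

print(drawing_to_svg([
    {"type": "line", "x1": 10, "y1": 10, "x2": 90, "y2": 90},
    {"type": "circle", "cx": 50, "cy": 50, "r": 30},
]))
```

The same traversal, with a different serializer per primitive, covers the PostScript and DXF export paths.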
A document clustering system with search & report generation features
A university project: document clustering software for an audit client, with additional features. The main clustering task takes a directory of documents as input and outputs an Excel spreadsheet displaying clusters of documents, with each cluster containing documents that are similar to one another. The search feature takes user-supplied search terms and a directory of documents as input and outputs an Excel spreadsheet listing all documents containing the search term, together with documents similar to them. A second feature returns every sentence containing the search term from the documents found. The report generation feature, designed specifically for audit companies, takes an audit report as input and outputs an insight log and a draft management letter with insights pulled from the report; this feature can be customised to suit a company's requirements. The software works with PDF, DOCX, TXT, and CSV files, and the zip file must be saved in "My Documents".
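Clustering and "similar documents" features like these typically rest on a pairwise similarity measure between documents. A minimal bag-of-words cosine similarity in Python (an illustration of the general technique, not this project's implementation, which may weight terms differently):

```python
import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    """Bag-of-words cosine similarity: ~1.0 for identical term profiles,
    0.0 for documents sharing no words."""
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

print(cosine_similarity("audit of cash controls", "audit of revenue controls"))  # 0.75
```

Documents whose pairwise similarity exceeds a threshold can then be grouped into the same cluster, and the search feature can rank "similar documents" by the same score.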
A Twitter client written in Python.
A Go or chess timer written in Python. The timer can be used in place of a standalone Go or chess clock when playing a game on a real-life board.
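The core of a Go or chess timer is a pair of countdown clocks where only the player to move loses time, and pressing the clock hands the turn over. A minimal sketch (a generic illustration, not this project's code; the injectable `now` parameter exists only to make the sketch testable without real waiting):

```python
import time

class ChessClock:
    """Two countdown clocks; only the player to move loses time."""

    def __init__(self, seconds_per_player, now=time.monotonic):
        self.remaining = [seconds_per_player, seconds_per_player]
        self.active = 0          # player 0 moves first
        self.now = now
        self.last = now()

    def press(self):
        """Called by the player who just moved: charge their elapsed
        thinking time and start the opponent's clock."""
        t = self.now()
        self.remaining[self.active] -= t - self.last
        self.last = t
        self.active = 1 - self.active

    def time_left(self, player):
        """Seconds remaining, counting the active player's running clock down."""
        elapsed = self.now() - self.last if player == self.active else 0.0
        return self.remaining[player] - elapsed
```

A GUI front end only needs to call `press()` on a button hit and poll `time_left()` to redraw both displays.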