Atomic OS is a responsive Web 2.0 operating environment & development platform. Based on AJAX techniques, it emulates/provides standard operating system features including a command-line shell, interpreter, filesystem, database access and GUI services.
MmUD is a client designed to work with MUDs that have formatted text output.
This is an effort to fully recreate the Hitchhiker's Guide to the Galaxy text adventure game in the Perl scripting language.
Perl Web Scraping Project
Web scraping (web harvesting or web data extraction) is data scraping used for extracting data from websites. Web scraping software may access the World Wide Web directly using the Hypertext Transfer Protocol, or through a web browser. While web scraping can be done manually by a software user, the term typically refers to automated processes implemented using a bot or web crawler. It is a form of copying, in which specific data is gathered and copied from the web, typically into a central local database or spreadsheet, for later retrieval or analysis. Web scraping a web page involves fetching it and extracting data from it. Fetching is the downloading of a page (which a browser does when you view the page); web crawling is therefore a main component of web scraping, fetching pages for later processing. Once a page is fetched, extraction can take place: the content may be parsed, searched, reformatted, its data copied into a spreadsheet, and so on.
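The fetch-then-extract split described above can be sketched with standard-library code (shown in Python for brevity, though the project itself is in Perl). The HTML snippet and the choice of extracting link targets are illustrative, not part of the project:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every anchor tag -- the 'extraction' step."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# In a real scraper the page would first be fetched over HTTP, e.g. with
# urllib.request.urlopen(url).read(); a static snippet stands in for it here.
page = '<html><body><a href="/a">A</a><a href="/b">B</a></body></html>'

parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # extracted data, ready to store or analyze
```

The same two phases apply whatever the extraction target is: the fetcher only downloads bytes, and the parser turns them into structured data.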
AI-TRIP finds an optimal route between a given start and destination through available mass transit systems, particularly train and bus. AI-TRIP can plan routes involving more than one transit system, and can find optimal routes based on time or cost.
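Finding a fastest or cheapest route across transit legs is a shortest-path problem. A minimal sketch using Dijkstra's algorithm, with a selectable weight for "time or cost" (the network data and function names are hypothetical, not AI-TRIP's actual code):

```python
import heapq

def best_route(graph, start, dest, metric):
    """Dijkstra's algorithm over transit legs; `metric` selects 'time' or 'cost'."""
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        total, node, path = heapq.heappop(queue)
        if node == dest:
            return total, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, weights in graph.get(node, {}).items():
            if nxt not in seen:
                heapq.heappush(queue, (total + weights[metric], nxt, path + [nxt]))
    return None

# Hypothetical network: each leg carries both a time and a cost weight,
# so the same graph can serve either optimization goal.
graph = {
    "A": {"B": {"time": 30, "cost": 2}, "C": {"time": 10, "cost": 5}},
    "B": {"D": {"time": 10, "cost": 2}},
    "C": {"D": {"time": 10, "cost": 5}},
}
print(best_route(graph, "A", "D", "time"))  # (20, ['A', 'C', 'D']) -- fastest
print(best_route(graph, "A", "D", "cost"))  # (4, ['A', 'B', 'D'])  -- cheapest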
Monitor VirtualHosts and streamline deployment to your Apache Server using Apache App Manager. Deploy static or dynamic (e.g. PHP) websites using zip files, rather than FTP'ing single files. Features: Upload, Deploy, Start/Stop, and Monitor sites.
Enrichment analysis for customized organisms
FWebSpider is a web crawler application written in Perl. It crawls a chosen site, featuring a response cache, URL storage, URL exclusion rules, and more. It is developed to serve as the core of a local/global site search engine.
The Kachina project seeks to build a mobile-commerce system for the iPhone based on 1-dimensional barcode scanning.
Admin, Backup, (Mirror) Cache, (re)Deploy, Embed (wizard), Migrate
"ABCD (th)em" refers to computers that need to be set up according to developed best-practice scripts and procedures. No liability is taken for any part of this project; you are expected to write your own recipes, no matter how polished our demo becomes. This is a Deployment, Lockdown, Automation, and Management framework for better install and automation scripting. The demo needs 3+ PCs, built in the following destructive spawn order. Start with a CentOS network install (manual parameter entry) or the ABCDem embedded-install ISO to provide Distros, the local deployment mastermind, which provides net-boot, spin-builder, and management services.
0) Distros: cache and all-in-one server.
1) Desktop: media, e-mail, and browser.
2) Ultra-basic workstation and gaming.
3) Random, but unchanging on the next rebuild.
*) Computers 1-3 will not confuse build roles or data, keeping their original birth order and migrating data from backups. Computer zero can build a self-contained rebuild-and-migrate auto ISO.
Develop the offline and online management
Maximal Clique Motif Reduction (MCMR) is a software program for running and then combining the output of multiple motif finder programs, such as MEME, AlignACE and Weeder, into a set of consensus predictions with associated confidence rankings.
Netradio plays and records internet radio streams (using RealPlayer or an MP3 player) or local files. It can be used with 'at' or 'cron' to record radio programmes when the user is not logged in to the computer.
REX (Remote EXecution) provides distributed computing services for Linux and Solaris: C and C++ APIs, the librex library, and the "rexd" daemon, implementing load-balancing process migration (dump + restore) and remote file and resource management.
Second Symthium Server
Code for reference implementations of identity brokers and simple single sign-on (SSO) mechanisms that utilize XDI and link contracts to manage the dataweb.
An open-source data management, analysis, and visualization system that makes scientific data development clean, easily reproducible, and easily sharable with the outside world.
This project tries to define rules for music library hierarchy, such as file names, directory hierarchy, data formats, and more.
Transducer is a lightweight pure Perl web server and template system that turns your machine into an interactive web site. Transducer only uses the standard modules included with Perl version 5.8.8 or higher. No additional modules or compiling required!
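Transducer's standard-modules-only approach can be illustrated with a template-substitution sketch (shown in Python's standard library as an analogue of Perl's core modules; the template syntax and names here are generic, not Transducer's actual API):

```python
from string import Template

# Hypothetical page template; a real server would load this from a file.
page = Template("<html><body><h1>$title</h1><p>Served at $path</p></body></html>")

def render(title, path):
    """Fill the template with per-request data -- no non-standard modules needed."""
    return page.substitute(title=title, path=path)

print(render("Home", "/"))
```

The design point is the same in either language: by sticking to modules that ship with the interpreter, the server installs with no compilation step and no dependency fetching.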
Unimatrix is the collective work of Matrix, Tsunami, and Mainframe. Unimatrix allows a setup of distributed file repositories: completely peerless, and using HTTP/1.0 as its transfer medium. Unimatrix can easily be used for file sharing.
Support for the Oasis XRI (Extensible Resource Identifiers) effort. This includes resolvers and client libraries for XRIs in multiple languages and multiple platforms. See http://www.oasis-open.org/committees/xri
Project, studies, OpenGL, Jumping
Performs gene set enrichment analysis for any organism.
Simple analysis of historical financial data
I've been noticing for a while that the graphs of "growth for the last 10 years" in the mutual fund prospectuses depend a lot on the starting point: if the starting point was high (i.e. after previous growth), the following growth would look poor compared to the competition, and the other way around. So I wrote some simple scripts to try to compensate for this effect.
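The start-point effect can be shown with toy numbers (entirely made up, not real fund data): the same price series looks far worse when the growth window starts at a post-run-up peak.

```python
def growth(prices, start, end):
    """Total growth over the window [start, end] as a fraction of the start price."""
    return prices[end] / prices[start] - 1.0

# Made-up yearly prices: a run-up to a peak at index 2, then steady gains.
prices = [100, 150, 200, 210, 220, 230]

# Same fund, two windows:
print(round(growth(prices, 0, 5), 2))  # 1.3  -> window before the run-up looks great
print(round(growth(prices, 2, 5), 2))  # 0.15 -> window from the peak looks mediocre
```

Compensating for this effect amounts to comparing funds over windows normalized against such starting-point bias rather than over a single fixed window.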
Perl implementation of a Financial Information Exchange (FIX) protocol parser and encoder.