Auto Index wap is an advanced download portal (multi-language).
DjAmolWap 13v - advanced auto index with web admin panel, multi-language support, and themes.
New updates:
- Multi-language website: 1) English 2) Urdu 3) Gujarati 4) Russian
- Users/visitors can manually change the website language
- The multi-language plugin can be switched on/off
- New functions added to the admin panel
- Automatic MP3 tag setting added
Official website: http://ai.djamol.com
Demo of features & installation: http://youtube.com/phpindia
Requirements:
* A web server (cPanel or other OS)
* PHP 5.0 or greater
* MySQL 5.0 or greater
* The mod_rewrite Apache module (.htaccess)
(Note: if mod_rewrite is not available, use DjAmolWap version 10.4, which works without .htaccess.)
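The mod_rewrite requirement above is usually satisfied by an .htaccess rewrite rule shipped with the package. As a rough sketch only (the front-controller file name `index.php` and the `q` parameter are assumptions, not this package's actual layout), such a rule might look like:

```apache
# Hypothetical mod_rewrite front-controller sketch.
RewriteEngine On
# Skip requests for real files and directories.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# Route everything else to the PHP front controller.
RewriteRule ^(.*)$ index.php?q=$1 [L,QSA]
```

Hosts without mod_rewrite will return 500 errors on such rules, which is why the note suggests the non-.htaccess version instead.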
Amberfish is general purpose text retrieval software. It supports nested queries of semi-structured text in XML format and traditional unstructured searching.
This was a terrible idea and is equally terribly implemented.
HTTP Directory Index is a PHP script that acts as a user-friendly graphical interface for indexing Web directories.
Simple application for downloading pictures from Zerochan.net
Simple Java application for downloading high-quality pictures from Zerochan.net. You can find images by size or by tag. It's simple. And flat. All you need to do is download the .jar file and run it with the Oracle JVM (or any other JVM that supports image decoding).
ht://Check is more than a link checker. It's particularly suitable for checking broken links, anchors and web accessibility barriers, but retrieved data can also be used for Web structure mining. Uses a MySQL backend. Derived from ht://Dig.
ASPseek is a full-featured medium-to-large scale SQL-based Internet search engine. It consists of an indexing robot, search daemon and search frontend (CGI program). These programs are written in C++ using the STL library.
An Apache2 DSO module search engine based on the Swish-e C API that returns results by replacing tags in a user-supplied HTML template. Users with Swish-e knowledge and the ability to generate a Swish-e index file should find the searchm interface familiar.
The goals of this system are as follows: 1. A user can maintain his bookmarks. 2. A user has a private bookmark counter. 3. A user can create categories of his own. 4. A user can see his bookmarks without logging in. 5. A user can see his friends' bookmarks. 6. Us
A search engine and crawler that provide plugin support for adding new ways of calculating relevancy.
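A plugin system for relevancy calculation, as described above, can be pictured as a registry of scoring callables. This is a hypothetical illustration in Python, not this project's actual API; the names `register_scorer` and `rank` are made up:

```python
# Hypothetical plugin registry for relevancy scoring (illustrative only).
from typing import Callable, Dict, List

# A scorer maps (query, document) to a relevance value.
Scorer = Callable[[str, str], float]

SCORERS: Dict[str, Scorer] = {}

def register_scorer(name: str):
    """Decorator that registers a new relevancy-scoring plugin under a name."""
    def wrap(fn: Scorer) -> Scorer:
        SCORERS[name] = fn
        return fn
    return wrap

@register_scorer("term_frequency")
def term_frequency(query: str, doc: str) -> float:
    # Count how often each query term appears in the document.
    words = doc.lower().split()
    return sum(words.count(t) for t in query.lower().split())

def rank(query: str, docs: List[str], scorer: str = "term_frequency") -> List[str]:
    """Rank documents by the chosen plugin's score, best first."""
    score = SCORERS[scorer]
    return sorted(docs, key=lambda d: score(query, d), reverse=True)
```

New relevancy measures would then be added by registering another function, without touching the crawler or ranking code.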
HooDoo is designed to provide most of the same functionality as Google, but available to everyone for their own websites.
A command-line HTML parser to be used in scripts to extract data from an HTML page according to a supplied path and options. Useful for systematic, periodic parsing of pages with a known structure where the information keeps changing, such as watching for an item on eBay.
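The path-based extraction idea can be sketched with Python's standard-library `html.parser`; the tag-path syntax here (a tuple of nested tag names) is a made-up stand-in for whatever path language the actual tool uses:

```python
# Sketch of path-based data extraction from HTML, standard library only.
from html.parser import HTMLParser
from typing import List, Sequence

class PathExtractor(HTMLParser):
    """Collect text found at a given tag path, e.g. ("html", "body", "li")."""

    def __init__(self, path: Sequence[str]):
        super().__init__()
        self.path = tuple(path)
        self.stack: List[str] = []   # currently open tags
        self.matches: List[str] = [] # text collected at matching positions

    def handle_starttag(self, tag, attrs):
        self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()

    def handle_data(self, data):
        # Record non-empty text whose enclosing tags match the path exactly.
        if tuple(self.stack) == self.path and data.strip():
            self.matches.append(data.strip())

def extract(html: str, path: Sequence[str]) -> List[str]:
    parser = PathExtractor(path)
    parser.feed(html)
    return parser.matches
```

Running `extract("<html><body><li>item</li></body></html>", ["html", "body", "li"])` returns the text of every `<li>` at that exact nesting, which is the kind of repeatable scrape the blurb describes.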
My Community Portal is an all-in-one Internet portal that offers forums, groups, chat, your own e-mail, a search engine, an Internet directory, your own home page, polls, dating services, a buddy list, MP3 and file sharing, and much more.
Nmap Log Stripper is a Bash script that condenses all, or some, of the IPs from a "random" (-iR) nmap scan into a file for later use.
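The stripping step amounts to pulling host IPs out of nmap's output lines. A rough equivalent in Python (assuming nmap's grepable `-oG` format with `Host:` lines, which may differ from what the actual script consumes):

```python
# Rough equivalent of stripping host IPs from nmap grepable (-oG) output.
import re
from typing import List

# "Host: 10.0.0.1 ()  Status: Up" -> capture the dotted-quad address.
IP_RE = re.compile(r"Host:\s+(\d{1,3}(?:\.\d{1,3}){3})")

def strip_ips(log_text: str) -> List[str]:
    """Return the unique host IPs from a scan log, in first-seen order."""
    seen, out = set(), []
    for match in IP_RE.finditer(log_text):
        ip = match.group(1)
        if ip not in seen:
            seen.add(ip)
            out.append(ip)
    return out
```

The deduplicated list can then be written to a file and fed back to nmap (e.g. via `-iL`) for a later, more detailed scan.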
OMax is a set of projects including a real estate crawler and a management system.
Omseek has been renamed to Xapian. Xapian is a Search Engine Library, written in C++ with bindings for Perl, Python, PHP, Java, Tcl, C# and Ruby. It allows you to easily add advanced indexing and search facilities to your applications. See xapian.org
Provides a fully featured server-side Web frontend for a well-known DCTC (Direct Connect Text Client) for Linux/BSD/MacOS systems.
RAHoo is a self-documenting, easy-to-install, fully customizable web application written in PHP using MySQL and a suitable web server. Use it to keep a directory structure of links similar to Yahoo or the Google search directories, but focuse
An RDF-based post content information system with associated APIs, used to provide intelligent information about the status and accessibility of a web document. Functionality: page re-directs, intelligent 404 handling, Threadneedle connectivity and other
Sciense Searcher is a system that lets you search, organize, and share bibliographic citations of research articles, books, booklets, collections, manuals, theses, proceedings, technical reports, unpublished works, and misc.
The Somewhat Intelligent Proxy [SIP] is an effort at an open-source, natural language, web accessible instrument which utilizes Internet sources to return answers to your questions.
Sprawler is the first Open Source internet search engine software and service - built by the community, for the community. It will address the various reasons most search engines today still are far from being where they need to be.
Grub is a distributed internet crawler/indexer designed to run on multi-platform systems, interfacing with a central server/database.
JAMP provides several functions to index and manage your media files on resources such as storage systems or DVDs. The user interface is web-based and written entirely in Java.