Smart Cache Loader is a highly configurable pure-Java web grabber with special support for integration with the Smart Cache proxy server. It can perform different loading operations based on URL mask, content-type, ...
A full SEO (Search Engine Optimization) app for Android.
It replaces the dozens of different applications you use for SEO by bundling all of them into one compact app.
A must-have tool for all bloggers, website owners, marketers, and Android web enthusiasts.
With PubSearch you can search for an author's publications in multiple publication databases at once. PubSearch can also crawl "cited by" publications transitively for you! You can export publications to citation formats, and you can add your own format templates and publication database definitions.
See the link below for more details.
"Der Dirigent" is an easy-to-use Content Management System based on PHP and MySQL.
=NO LONGER WORKS: THE DSA HAS ADDED A CAPTCHA= DSA Practical Driving Test Monitor helps you find any available practical driving test slot within a specified date range. It runs on Linux/Mac/Windows and automates the manual task of finding a test slot.
WebWatcher - a Web-page Update Monitor. This program helps you keep an eye on interesting web pages. You register a list of URLs you want to monitor, and WebWatcher checks for changes whenever you ask it to, or at given intervals.
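The core of such a monitor is cheap change detection. A minimal sketch of the general technique (not WebWatcher's actual implementation; class and method names here are illustrative): hash the fetched page body and compare it with the hash stored from the previous check.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Illustrative change-detection sketch: a page has "changed" when the
// digest of its current body differs from the digest stored last time.
public class PageChangeDetector {

    // Hex-encoded SHA-256 digest of the page body.
    static String fingerprint(String pageBody) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            byte[] digest = md.digest(pageBody.getBytes(StandardCharsets.UTF_8));
            StringBuilder hex = new StringBuilder();
            for (byte b : digest) hex.append(String.format("%02x", b));
            return hex.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException("SHA-256 unavailable", e);
        }
    }

    // Compare the stored fingerprint against the freshly fetched body.
    static boolean hasChanged(String storedFingerprint, String currentBody) {
        return !fingerprint(currentBody).equals(storedFingerprint);
    }

    public static void main(String[] args) {
        String stored = fingerprint("<html>old</html>");
        System.out.println(hasChanged(stored, "<html>old</html>")); // prints false
        System.out.println(hasChanged(stored, "<html>new</html>")); // prints true
    }
}
```

A real monitor would fetch the body over HTTP on a timer and persist the fingerprints between runs; hashing keeps the stored state tiny regardless of page size.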
This is an ***old archive*** of tools developed to facilitate the use of Creative Commons licenses and metadata. --- For the most up-to-date representation of any of the projects listed here, please see: http://creativecommons.org/project/Developer.
iVia is an Internet subject portal or virtual library system. As a hybrid expert- and machine-built collection creation and management system, it can crawl resources and automatically generate/extract metadata and selected full text.
WebNews Crawler is a specific web crawler (spider, fetcher) designed to acquire and clean news articles from RSS and HTML pages. It can do site-specific extraction to pull out only the actual news content, filtering out advertising and other cruft.
Aracnis is a Java based framework for building distributed web spiders. These spiders can be used to accomplish a variety of tasks, for example, screen-scraping and link integrity checking.
This forum software is a Java-based discussion forum that uses JDBC to store data in a database. It is available in several languages and has features for easy integration into a site and easy administration of the forum.
webspider provides a mechanism to retrieve content from the web. With the extended classes, you can do the following things:
1. grab URLs from a specified base URL
2. analyze the contents of a list of URLs
3. fetch specific files from the web
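Step 1 above (grabbing URLs from a base page) can be sketched as follows. This is not webspider's actual API, just the general technique: pull `href` targets out of fetched HTML with a regular expression.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Illustrative link-grabbing sketch: scan an HTML string for absolute
// http(s) href targets and collect them in document order.
public class LinkGrabber {
    private static final Pattern HREF =
        Pattern.compile("href=\"(https?://[^\"]+)\"");

    static List<String> grabUrls(String html) {
        List<String> urls = new ArrayList<>();
        Matcher m = HREF.matcher(html);
        while (m.find()) urls.add(m.group(1));
        return urls;
    }

    public static void main(String[] args) {
        String html = "<a href=\"https://example.com/a\">A</a>"
                    + "<a href=\"https://example.com/b\">B</a>";
        System.out.println(grabUrls(html));
    }
}
```

A production crawler would use a real HTML parser and resolve relative links against the base URL; a regex like this only handles the simple absolute-URL case.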
QuickWCM is a Web Content Manager (WCM) with a very easy-to-use web-based interface, a seamless security model, an integrated search engine, and more. QuickWCM runs on a JSR-170 repository and is easy to extend with JSR-168 portlets.
JaWiki is a Java wiki with a file-based database for managing content. The content is stored in XML files in the file system. An HTML frontend lets users edit the content via a browser. A standalone server is also included.
JLinkCheck is an Ant task written in Java for checking links in websites. It does not just check a single page; it crawls a whole site like a spider, generating a report in XML and (X)HTML. JReptator will be its successor, with many more features.
Toke is a web-mining toolkit for web exploring, indexing, and searching in Java. Toke allows you to crawl public or private web sites in order to create web statistics, Pajek web graphs, Lucene indexes, and word-frequency files for data clustering.
Catalogo is a system for cataloguing resources on a website. It allows semantic search of information on an intranet using metadata, RDF, and ontology concepts. It provides a Catalog server (a Java web application) and a Catalog client (a Firefox plug-in).
Sperowider Website Archiving Suite is a set of Java applications whose primary purpose is to spider dynamic websites and create static, distributable archives with a full-text search index usable by an associated Java applet.
SmartCrawler is a Java-based, fully configurable, multi-threaded, and extensible crawler that can fetch and analyze the contents of a website using dynamically pluggable filters.
ASpider is a robust, feature-rich, multi-threaded CLI web spider written in Java using Apache Commons HttpClient 3.0. It downloads any files matching your given MIME types from a website, tries to regexp-match email addresses by default, and logs all results using log4j.
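The default email-matching behavior described above boils down to running an email regular expression over each fetched page. A minimal sketch using a common simplified pattern (not ASpider's exact expression):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Illustrative email extraction: a widely used simplified address
// pattern, applied over arbitrary page text.
public class EmailExtractor {
    private static final Pattern EMAIL =
        Pattern.compile("[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\\.[A-Za-z]{2,}");

    static List<String> extract(String text) {
        List<String> found = new ArrayList<>();
        Matcher m = EMAIL.matcher(text);
        while (m.find()) found.add(m.group());
        return found;
    }

    public static void main(String[] args) {
        System.out.println(extract("contact alice@example.org or bob@example.com"));
    }
}
```

Note that no regex fully captures the RFC 5322 address grammar; crawlers use a pragmatic pattern like this and accept occasional misses.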
HouseSpider is a Java applet that adds search capability to your website. It can search by two methods: by spidering through your site or by searching a cached index file. It has 100% i18n (internationalization) support.
DialogSearch is an experimental approach to website searching, which uses the similarity between web pages to retrieve them. It is an alternative to hyperlink-based algorithms such as PageRank and HITS. BEWARE: This is only an experimental prototype.
i-Tor is a set of Tools and Technologies for Open Repositories, based on Linux, Java, MySQL, Mirage and other free components. It harvests OAI and turns databases into Open Archives. It includes similarity, backlinks and related search based on Lucene.
The application will provide further information about the location of a host by analyzing the sender's IP address. It works like other localizer software and provides different types of visualisation (map, text).