A PHP search engine for your website and web analytics tool. GNU GPL3
ahCrawler is a toolset for implementing your own search on your website and analyzing your web content. It can run on shared hosting.
It consists of
* crawler (spider) and indexer
* search for your website(s)
* search statistics
* website analyzer (HTTP headers, short titles and keywords, link checker, ...)
You install it on your own server, so all crawled data stays in your environment.
With an external web spider you never know when your content was last indexed; here you can trigger a rescan whenever you like.
eZimDMS, based on DocMgr, is a Document Management System (DMS) that adds commercial features such as an upload progress bar, enhanced security, and more. Our goal is to provide a full and complete document management system.
Allows teams to register their athletes for a track meet. Features include web-based administration, different divisions, unlimited events, printable screens, security features, and more. It can seed the athletes in each event for any number of divisions.
Netjuke is a web-based audio streaming jukebox powered by PHP 4, a database, and the MP3, Ogg Vorbis, and other format files that make up your digital music collection. It supports images, language packs, multi-level security, random playlists, and more.
OpenAnonymity consists of a module for the Apache 2.0 web server and a framework that lets you control search engine spider indexing at the word level, rather than at the file level as in the Robots Exclusion Protocol. OA can force spiders to follow these rules.
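For contrast, a standard robots.txt under the Robots Exclusion Protocol can only grant or deny access to whole paths; this minimal, hypothetical example shows that file-level granularity, which is exactly what OpenAnonymity's word-level control goes beyond:

```
# robots.txt — excludes entire files/directories, never individual words
User-agent: *
Disallow: /private/        # whole directory hidden from all spiders
Disallow: /drafts/page.html  # a single page, but still all-or-nothing
Allow: /
```

With only these directives, a spider either indexes a page completely or skips it completely; there is no standard way to mark specific words or passages within an allowed page as off-limits.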