Irudiko is a C++ library for generating Locality Sensitive Hashing (LSH) sketches from any textual or web document. Mainly designed to work with HTML pages, it also includes optimizations for English and Italian documents.
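Irudiko's internals aren't shown here, but the technique it names can be sketched briefly: shingle the text into character n-grams, then keep one minimum hash per seed (MinHash, a common LSH family for near-duplicate document detection). All names below are illustrative, not Irudiko's API, and the sketch uses Python for brevity.

```python
import hashlib

def shingles(text, k=4):
    """Split normalized text into overlapping character k-grams."""
    text = " ".join(text.lower().split())
    return {text[i:i + k] for i in range(max(1, len(text) - k + 1))}

def minhash_sketch(text, num_hashes=64):
    """For each seeded hash function, keep the minimum hash value
    over all shingles; the list of minima is the sketch."""
    sketch = []
    for seed in range(num_hashes):
        m = min(
            int.from_bytes(hashlib.sha1(f"{seed}:{s}".encode()).digest()[:8], "big")
            for s in shingles(text)
        )
        sketch.append(m)
    return sketch

def estimate_similarity(a, b):
    """The fraction of matching sketch positions estimates the
    Jaccard similarity of the two shingle sets."""
    return sum(x == y for x, y in zip(a, b)) / len(a)
```

Two near-duplicate sentences produce sketches that agree in most positions, while unrelated texts agree in almost none, which is what makes fixed-size sketches useful for comparing whole document collections.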
This project intends to create an indexing search engine for knowledge management. The primary objective is to apply an information retrieval core and to implement knowledge discovery techniques such as data mining and text mining.
Analysis and interactive visualization of a web-based community. Supports different views of the given social network to present community groups to the user; specific information about each member is also provided.
A configurable knowledge management framework. It works out of the box, but it is meant mainly as a framework for building complex information retrieval and analysis systems. Its three major components (Crawler, Analyzer, and Indexer) can also be used separately.
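The framework's actual interfaces aren't documented here, but the three-stage division it describes is a standard pipeline and can be sketched with illustrative stand-ins (these function names are assumptions, not the framework's API):

```python
def crawl(pages):
    """Crawler stage: yield raw documents (here from an in-memory dict
    standing in for fetched URLs)."""
    for url, html in pages.items():
        yield {"url": url, "raw": html}

def analyze(docs):
    """Analyzer stage: tokenize and normalize each document's text."""
    for doc in docs:
        doc["tokens"] = doc["raw"].lower().split()
        yield doc

def index(docs):
    """Indexer stage: build an inverted index mapping token -> set of URLs."""
    inverted = {}
    for doc in docs:
        for tok in set(doc["tokens"]):
            inverted.setdefault(tok, set()).add(doc["url"])
    return inverted

pages = {"a.html": "knowledge management systems",
         "b.html": "information retrieval systems"}
idx = index(analyze(crawl(pages)))
```

Because each stage only consumes and produces plain document records, any stage can be swapped out or run on its own, which is the point of keeping the three components separable.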
phpTrafMon is a set of scripts written in PHP. It displays local network traffic, and each user's share of it, in an attractive and user-friendly way. phpTrafMon requires MySQL, crontab, and the popular IPFM traffic-monitoring program.
Image2DocInfo has been made to quickly tag digital pictures. A GUI allows you to set attributes for an image and then store them in XML files. Those files follow the Dublin Core naming scheme and are stored in the same directories as the pictures.
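The tool's exact file layout isn't specified here, but Dublin Core element naming has a well-defined shape: metadata fields become elements such as `dc:title` and `dc:creator` in the `http://purl.org/dc/elements/1.1/` namespace. A minimal Python sketch of serializing image attributes that way (the `metadata` root element and function name are assumptions):

```python
import xml.etree.ElementTree as ET

DC_NS = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("dc", DC_NS)

def write_image_metadata(attrs):
    """Serialize image attributes as namespace-qualified Dublin Core
    elements under a generic root (illustrative layout)."""
    root = ET.Element("metadata")
    for name, value in attrs.items():
        el = ET.SubElement(root, f"{{{DC_NS}}}{name}")
        el.text = value
    return ET.tostring(root, encoding="unicode")

xml = write_image_metadata({"title": "Sunset at the lake",
                            "creator": "A. Photographer",
                            "date": "2004-07-15"})
```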
lease-parser is a simple daemon that records the lease state changes of an ISC DHCP server to a database for historical reference. The data can be searched via a web search form that is provided with the tool.
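lease-parser's own code isn't shown here, but the input format is standard: an ISC `dhcpd.leases` file is a series of `lease <ip> { ... }` blocks containing fields like `binding state` and `hardware ethernet`. A minimal, illustrative parser for that syntax (not the daemon's actual implementation):

```python
import re

# Match "lease <ip> { <body> }" blocks in a dhcpd.leases file.
LEASE_RE = re.compile(r"lease\s+(\S+)\s*\{(.*?)\}", re.DOTALL)

def parse_leases(text):
    """Extract ip, binding state, MAC, and hostname from lease blocks."""
    leases = []
    for ip, body in LEASE_RE.findall(text):
        lease = {"ip": ip}
        for line in body.splitlines():
            line = line.strip().rstrip(";")
            if line.startswith("binding state"):
                lease["state"] = line.split()[-1]
            elif line.startswith("hardware ethernet"):
                lease["mac"] = line.split()[-1]
            elif line.startswith("client-hostname"):
                lease["hostname"] = line.split(None, 1)[1].strip('"')
        leases.append(lease)
    return leases
```

Records like these, written to a database on every state change, are what makes the historical web search possible.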
JavaMatch is an engine that can search inside runtime Java data structures and look for the objects that best match the criteria you specify. The extensive query mechanism allows for highly customizable tuning of your match queries.
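JavaMatch's query language isn't reproduced here; the sketch below only illustrates the general idea of best-match search over live objects, scoring each object against weighted criteria rather than filtering exactly. It is written in Python for brevity, and all names are illustrative:

```python
def match_score(obj, criteria):
    """Score an object against criteria mapping attribute name ->
    (desired_value, weight). Exact matches earn the full weight;
    numeric attributes earn partial credit for being close."""
    score = 0.0
    for attr, (desired, weight) in criteria.items():
        value = getattr(obj, attr, None)
        if value == desired:
            score += weight
        elif isinstance(value, (int, float)) and isinstance(desired, (int, float)):
            score += weight / (1.0 + abs(value - desired))
    return score

def best_match(objects, criteria):
    """Return the object with the highest match score."""
    return max(objects, key=lambda o: match_score(o, criteria))
```

The useful property is graceful degradation: when no object satisfies every criterion exactly, the engine still returns the closest one instead of an empty result set.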
The Internet Censor is a multi-platform Internet clustering program whose resulting data will be used to build a non-profit content-filtering Internet search engine for children.
HORUS is a system for knowledge acquisition, hypothesis generation, inference, and learning. It is an interactive Internet environment accessible to a diverse community of users (on a public-access or membership basis); see also the UMKAILASH project for more.
This project is a Python-based HTTP web proxy server that hooks into MySQL to store a full history of your browsing. It lets you review statistics about your browsing habits, creates a personal portal page, and offers search features, multi-user support, and filtering.
PyEsp - Enhanced/Evolving/Extensible Semantic Profiling.
This Python program will sort and filter search results by applying semantic profiling to web pages. It will learn the user's preferences, and profiling is done on the client computer.
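PyEsp's actual profiling model isn't described here; one common way to realize client-side re-ranking of this kind is to aggregate a term-frequency profile from pages the user marked relevant and sort new results by cosine similarity to it. A minimal sketch under that assumption (all names are illustrative):

```python
import math
from collections import Counter

def build_profile(liked_texts):
    """Aggregate term frequencies from pages the user found relevant."""
    counts = Counter()
    for text in liked_texts:
        counts.update(text.lower().split())
    return counts

def cosine(a, b):
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rerank(results, profile):
    """Sort result snippets by similarity to the user's profile."""
    return sorted(results,
                  key=lambda r: cosine(Counter(r.lower().split()), profile),
                  reverse=True)
```

Because the profile lives and is updated entirely on the client, the search engine never sees the user's preference data, which matches the description's emphasis on client-side profiling.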
studiMaps is a web-based application for the visualization and analysis of social networks. It consists of two software components: a web crawler for data collection and the web-based application for visualization.