LogDistiller is a logfile merge and sort tool. Log content is classified according to rules configured in an XML file. Classification results go into reports, which can be published in several ways: simply stored in a file, sent by mail, or even added to a news feed.
This project analyzes log lines from the Wikipedia Squid servers by parsing and filtering their information elements according to directives specified in an XML file. The resulting information is stored in a MySQL database for further analysis.
Data mining tool for sequences (e.g. trajectories on a map, visited web pages, etc.) that creates a succinct description of the sequences, given a taxonomy (e.g. regions and sub-regions in the map, categories and sub-categories of pages, etc.).
JNFA is a NetFlow analyzer. It uses a MySQL database to store accounting information. JNFA's filters allow very flexible classification of any kind of traffic, storing it in different database fields.
Like Unix tail, BUT:
- Runs with or without a GUI
- Suspend and resume tailing at runtime
- Can monitor a set of files
- Print output to a text field, stdout, or a file
- Runs in "grep" mode, too (reads files once)
- (Almost) the same options as Unix tail
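The "grep" mode described above (read each monitored file once and report matching lines) can be sketched in a few lines; the function name and the tagged-tuple output are illustrative assumptions, not the project's actual API:

```python
import re

def grep_mode(paths, pattern):
    """Read each file once and return matching lines, tagged with their filename.

    A minimal sketch of a read-once 'grep' mode over a set of files.
    """
    rx = re.compile(pattern)
    hits = []
    for path in paths:
        with open(path) as f:
            for line in f:
                if rx.search(line):
                    hits.append((path, line.rstrip("\n")))
    return hits
```

Tagging each hit with its source path keeps the output usable when a whole set of files is monitored at once.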
LogCrawler is an ANT task for automatic testing of web applications. Using an HTTP crawler, it visits all pages of a website and checks the server logfiles for errors. Use it as a "smoke test" with a CI system such as CruiseControl.
Commandline tool that can view multiple log files remotely and blend them into a single output, ordered by the dates within the log files themselves. Has cat and tail modes available, with support for multiple local/remote protocols.
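The core of such date-based blending is a k-way merge keyed on each line's timestamp. A minimal sketch, assuming every stream is already sorted and every line begins with a fixed-width `YYYY-MM-DD HH:MM:SS` timestamp (real log formats vary, and the function name is hypothetical):

```python
import heapq
from datetime import datetime

def merge_logs(streams, ts_format="%Y-%m-%d %H:%M:%S", ts_width=19):
    """Blend several sorted log-line iterables into one chronological stream.

    Each line is keyed by the timestamp it starts with; heapq.merge then
    interleaves the streams without loading them fully into memory.
    """
    def keyed(stream):
        for line in stream:
            yield (datetime.strptime(line[:ts_width], ts_format), line)

    for _, line in heapq.merge(*(keyed(s) for s in streams)):
        yield line
```

Because the inputs are lazy iterables, the same merge works for a "cat" mode (finite files) and, in principle, for a "tail" mode fed by live streams.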
Peepo is a tool for remote analysis of Apache weblogs in real time. It consists of a server that broadcasts Apache logs via UDP and a desktop application that receives, filters and plots them.
Qualiweb aims to provide semantic web metrics for modeling a website's visitors' needs according to a given taxonomy or document classification. The web metrics provided by Qualiweb indicate how successful each of the website's topics has been.
jwebstats is an application to generate web site statistics and reports based upon Apache Web Server log files. The application is run from the commandline and generates HTML reports.
Web log analyzer written in Java that aims to provide the usual host/page analysis. Can also do site graphing using Graphviz; browser, OS, worm, and search engine identification; and country and session tracking.
Web analyzer for logs in different formats that outputs XML reports. Supports multi-host log files, can apply an XSL stylesheet to produce HTML output, and uses SVG to make the graphs. The project includes a library to parse the HTTP_USER_AGENT header.
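Extracting product/version tokens is the first step of any HTTP_USER_AGENT parser. A rough sketch of that step only; the regex and function name are illustrative assumptions, and the project's actual library is far more complete:

```python
import re

def parse_user_agent(ua):
    """Extract (product, version) pairs from a User-Agent string.

    Matches 'Token/version' fragments; comments in parentheses are ignored.
    """
    return re.findall(r"([A-Za-z][\w.]*)/(\d[\w.]*)", ua)
```

For example, a typical Firefox User-Agent yields `Mozilla`, `Gecko`, and `Firefox` tokens with their versions, from which browser and OS identification can proceed.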
Platform based on JMS queues to centralize system event messages (syslog) and application-layer messages, persisting them in several databases and providing several pre- and post-processes according to the message's nature, such as encryption or mail notification.
This is a program for my sister that runs on the commandline, scrapes ISBNs off a website, cross-references them with Amazon, and makes a CSV out of the data.