Matlab-Machine is a comprehensive collection of machine learning algorithms implemented in MATLAB. It includes both basic and advanced techniques for classification, regression, clustering, and dimensionality reduction. Designed for educational and research purposes, the repository provides clear implementations that help users understand core ML concepts.
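As a small taste of the kind of task such a collection covers, here is a minimal clustering run. This sketch uses the `kmeans` function from MATLAB's Statistics and Machine Learning Toolbox (also available in Octave's statistics package); the repository's own implementations and function names may differ.

```matlab
% Minimal k-means clustering example (assumes the toolbox kmeans function;
% the repository's own clustering code may use different names).
rng(1);                                  % reproducible random numbers
X = [randn(50,2); randn(50,2) + 4];      % two well-separated Gaussian blobs
k = 2;
[idx, C] = kmeans(X, k);                 % cluster labels and centroids
fprintf('Centroid 1: (%.2f, %.2f)\n', C(1,1), C(1,2));
fprintf('Centroid 2: (%.2f, %.2f)\n', C(2,1), C(2,2));
```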
This site contains four packages for mass estimation and mass-based density estimation.
1. The first package covers basic mass estimation, including one-dimensional mass estimation and Half-Space Tree based multi-dimensional mass estimation. This package contains the necessary code to run on MATLAB.
2. The second package includes source and object files of DEMass-DBSCAN to be used with the WEKA system.
3. The third package DEMassBayes includes the source and object files of a Bayesian classifier using DEMass. ...
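To give a feel for the underlying idea, the following is a toy Monte Carlo sketch of level-1 one-dimensional mass estimation: draw random binary splits of the data range and, for each query point, count the data points that fall on the same side of the split. This is an illustrative assumption about the formulation, not the package's actual API.

```matlab
% Toy level-1 one-dimensional mass estimation via random splits
% (illustrative sketch only; the package's real code differs).
rng(1);
data = randn(200, 1);                 % sample from a standard normal
x    = [-2; 0; 2];                    % query points
nsplits = 5000;
lo = min(data); hi = max(data);
mass = zeros(size(x));
for t = 1:nsplits
    s = lo + (hi - lo) * rand();      % random binary split of the range
    for i = 1:numel(x)
        if x(i) < s
            mass(i) = mass(i) + sum(data < s);   % points on x's side
        else
            mass(i) = mass(i) + sum(data >= s);
        end
    end
end
mass = mass / nsplits;                % average mass over all splits
disp([x, mass]);                      % mass peaks near the median, falls off in the tails
```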
Simple .m files, Basic Neural Networks study for Octave (or Matlab)
--> For a more detailed description, check the README text under the 'Files' menu option :)
The project consists of a few very simple .m files for a basic
neural networks study under Octave (or Matlab).
The idea is to provide a context that allows beginners to
develop neural networks while getting to see and feel
how a basic neural network behaves.
The code is completely open to be modified and may suit several scenarios.
The commenting is verbose, and variable and function names follow
English conventions, so the code should be largely self-explanatory.
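A study like this usually starts from a forward pass. The sketch below shows a two-layer feed-forward network with sigmoid activations; the weights and layer sizes are made up for illustration and are not taken from the project's .m files.

```matlab
% Minimal two-layer feed-forward pass with sigmoid activations
% (hypothetical values; the project's own files will differ).
sigmoid = @(z) 1 ./ (1 + exp(-z));

x  = [0.5; -1.2];            % 2-dimensional input
W1 = [0.1 0.4; -0.3 0.8];    % hidden-layer weights (2 neurons)
b1 = [0.0; 0.1];             % hidden-layer biases
W2 = [0.7 -0.5];             % output-layer weights (1 neuron)
b2 = 0.2;                    % output-layer bias

h = sigmoid(W1 * x + b1);    % hidden activations
y = sigmoid(W2 * h + b2);    % network output in (0, 1)
fprintf('output = %.4f\n', y);
```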
...
Bayesian Surprise Matlab toolkit is a basic toolkit for computing Bayesian surprise values given a large set of input samples. It is also useful as a way of exploring surprise theory. For more information see also: http://ilab.usc.edu/
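Bayesian surprise is defined as the KL divergence from the prior belief to the posterior belief after observing data. The following is a minimal sketch of that definition for a discrete belief; the toolkit's actual functions and units may differ.

```matlab
% Bayesian surprise for a discrete belief: KL(posterior || prior), in bits.
% (Illustrative sketch of the definition; not the toolkit's API.)
prior     = [0.25 0.25 0.25 0.25];      % uniform belief over 4 hypotheses
posterior = [0.70 0.10 0.10 0.10];      % belief after observing data
surprise  = sum(posterior .* log2(posterior ./ prior));
fprintf('surprise = %.4f bits\n', surprise);
```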