IRAMUTEQ (Interface de R pour les Analyses Multidimensionnelles de Textes et de Questionnaires): an R interface for multidimensional analyses of texts and questionnaires. Data-processing software for text corpora or individuals/characteristics data; it notably supports "ALCESTE"-type analyses.
A C++, Matlab and Python library for hidden-state Conditional Random Fields. It implements three algorithms: LDCRF, HCRF and CRF. Available for Windows and Linux, 32- and 64-bit, optimized for multi-threading, and works with sparse or dense input features.
Python-based reader for SnowMicroPen® .pnt measurements
SnowMicroPyn is an open-source reader for SnowMicroPen® files in the binary .pnt format. The reader is written in Python and published both as source code and as compiled executables for Windows, Linux and Mac. The software is preliminary, so please report bugs; for further information, please read the Readme.txt in the download section. The SnowMicroPen® (SMP), developed by the WSL Institute for Snow and Avalanche Research, Davos, is the first high-resolution snow penetrometer. It measures the bonding force between snow grains with both high spatial resolution and high speed. The SMP can be used in application areas such as snow profiling (avalanche forecasting, snow stratigraphy, remote-sensing ground truth), ski-track characterization (ski racing) and snow-runway characterization (stability testing).
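A reader of this kind typically unpacks a fixed-size binary header and then the force samples with `struct`. The layout below is purely hypothetical (the real .pnt field offsets are documented with SnowMicroPyn) and serves only to illustrate the approach:

```python
import struct

def read_pnt(path, header_size=512):
    """Illustrative binary reader. HYPOTHETICAL layout: a fixed-size
    header followed by big-endian 16-bit force samples; the real .pnt
    field offsets differ and are documented with SnowMicroPyn."""
    with open(path, "rb") as f:
        header = f.read(header_size)  # raw header bytes, parsed elsewhere
        raw = f.read()                # remainder: the force samples
    n = len(raw) // 2
    samples = struct.unpack(f">{n}h", raw[: n * 2])
    return header, samples
```

The same pattern (one `read` for the header, one bulk `unpack` for the samples) keeps the reader fast even for long profiles.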
Fuzzy C Means
FuMé is planned as a framework for unsupervised learning based on the Fuzzy C-Means algorithm, incorporating various methods for determining optimal parameters: number of clusters, fuzzifier and norm.
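The core Fuzzy C-Means iteration alternates between updating cluster centers and fuzzy memberships. A minimal sketch with assumed parameter names (a generic illustration, not FuMé's actual API):

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Minimal Fuzzy C-Means: returns (centers, membership matrix U).
    c = number of clusters, m = fuzzifier (> 1), Euclidean norm."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # Random initial memberships; each row sums to 1.
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(max_iter):
        Um = U ** m
        # Centers: fuzzily weighted means of the data points.
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # Euclidean distance of every point to every center.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.fmax(d, 1e-10)  # guard against division by zero
        # Membership update: inverse-distance ratios raised to 2/(m-1).
        inv = d ** (-2.0 / (m - 1.0))
        U_new = inv / inv.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return centers, U
```

The fuzzifier `m` controls how soft the partition is: as `m` approaches 1, memberships approach hard (k-means-like) assignments, which is why choosing it automatically is one of the parameter problems the description mentions.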
The project was born at FASTA University, Argentina, with the goal of studying users' keystroke-dynamics patterns by analyzing their digraphs/trigraphs. Our research aims at detecting and differentiating users by analyzing their typing patterns.
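Digraph analysis of this kind boils down to collecting inter-key latencies per key pair. A minimal sketch under an assumed input shape (a list of (key, timestamp) events; the project's actual data model is not described here):

```python
from collections import defaultdict

def digraph_latencies(events):
    """events: list of (key, press_time_ms) in typing order.
    Returns the mean press-to-press latency per digraph (key pair),
    which forms a simple per-user typing profile."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for (k1, t1), (k2, t2) in zip(events, events[1:]):
        sums[(k1, k2)] += t2 - t1
        counts[(k1, k2)] += 1
    return {dg: sums[dg] / counts[dg] for dg in sums}
```

Comparing these per-digraph latency profiles between sessions is one common way to differentiate users; trigraphs extend the same idea to triples of consecutive keys.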
HaxHexEditor is a file-search system that searches multiple files in batch for multiple user-defined search strings. The search strings can contain multiple wildcards. The search results are exported in the HexWorkshop .HBK format.
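Such a batch wildcard search can be sketched as translating each wildcard pattern into a regular expression and scanning every file with it; the .HBK export step is omitted, and all names below are illustrative, not HaxHexEditor's actual code:

```python
import re
from pathlib import Path

def wildcard_to_regex(pattern: bytes) -> re.Pattern:
    """Translate a simple wildcard pattern (* = any run of bytes,
    ? = exactly one byte) into a compiled byte regex."""
    parts = []
    for b in pattern:
        ch = bytes([b])
        if ch == b"*":
            parts.append(b".*?")      # non-greedy: shortest match
        elif ch == b"?":
            parts.append(b".")
        else:
            parts.append(re.escape(ch))
    return re.compile(b"".join(parts), re.DOTALL)

def batch_search(paths, patterns):
    """Search every file in `paths` for every wildcard pattern.
    Yields (path, byte offset, matched bytes) for each hit."""
    regexes = [wildcard_to_regex(p) for p in patterns]
    for path in paths:
        data = Path(path).read_bytes()
        for rx in regexes:
            for m in rx.finditer(data):
                yield path, m.start(), m.group()
```

Working on bytes rather than text keeps the search usable on binary files, which matters for a hex-editor companion tool.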
A tool for exporting Wikipedia data
A simple tool, created in the French project "CoMeRe", for exporting data from all Wikimedia Foundation projects, including Wikipedia. To use it, you need to prepare two things: the target wiki dump and the target page name. Configure the tool with the dump path, input the page name, and let it run; you will get the target page. The duration depends on the size of the dump and the number of pages. Be careful with the option "talk, other talk...": it currently works only for French corpora.
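Extracting one page from a Wikimedia XML dump can be sketched as streaming the dump and matching on each page's title; this is a generic standard-library illustration, not the tool's actual implementation:

```python
import xml.etree.ElementTree as ET

def local(tag):
    """Strip the XML namespace prefix from a tag name."""
    return tag.rsplit("}", 1)[-1]

def extract_page(dump_path, target_title):
    """Stream a Wikimedia XML export and return the wikitext of the
    page titled target_title, or None if it is not in the dump."""
    for _, elem in ET.iterparse(dump_path, events=("end",)):
        if local(elem.tag) != "page":
            continue
        title = text = None
        for child in elem.iter():
            if local(child.tag) == "title":
                title = child.text
            elif local(child.tag) == "text":
                text = child.text
        if title == target_title:
            return text
        elem.clear()  # free memory: full dumps can be tens of GB
    return None
```

Streaming with `iterparse` and clearing each finished `<page>` element is what keeps the run time proportional to the dump size, as the description notes, without loading the whole dump into memory.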