Sanchay is a collection of tools and APIs for language researchers. It includes implementations of NLP algorithms, flexible APIs, several user-friendly annotation interfaces, and the Sanchay Query Language for language resources.
Facilitates data mining and natural language processing experiments on weblogs, such as classification, clustering, and rating. As part of these experiments, Latent Semantic Analysis can be applied.
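The Latent Semantic Analysis step amounts to a truncated SVD of a term-document matrix. The following is a minimal sketch, not code from this project, assuming the Apache Commons Math 3 library; the toy counts and the choice of two latent dimensions are illustrative only.

```java
import org.apache.commons.math3.linear.Array2DRowRealMatrix;
import org.apache.commons.math3.linear.RealMatrix;
import org.apache.commons.math3.linear.SingularValueDecomposition;

public class LsaSketch {
    public static void main(String[] args) {
        // Toy term-document matrix: rows = terms, columns = weblog posts.
        // In practice the counts would typically be TF-IDF weighted.
        double[][] counts = {
            {2, 0, 1, 0},   // "recipe"
            {1, 0, 2, 0},   // "cooking"
            {0, 3, 0, 1},   // "election"
            {0, 1, 0, 2},   // "policy"
        };
        RealMatrix termDoc = new Array2DRowRealMatrix(counts);

        // LSA: truncated SVD of the term-document matrix.
        SingularValueDecomposition svd = new SingularValueDecomposition(termDoc);
        int k = 2; // number of latent dimensions to keep (illustrative)

        // Document vectors in the latent space: rows of V restricted to the
        // first k columns, scaled by the corresponding singular values.
        RealMatrix v = svd.getV();
        double[] singular = svd.getSingularValues();
        for (int doc = 0; doc < termDoc.getColumnDimension(); doc++) {
            StringBuilder sb = new StringBuilder("doc " + doc + ":");
            for (int dim = 0; dim < k; dim++) {
                sb.append(String.format(" %.3f", v.getEntry(doc, dim) * singular[dim]));
            }
            System.out.println(sb);
        }
    }
}
```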
NLPTools-ES is a Spanish plugin for GATE (General Architecture for Text Engineering). It includes a tokenizer, sentence splitter, gazetteer, and POS tagger.
QuickAI (pronounced "quickeye", or just "Quick" for short) is a return to the fundamental goals of creating an artificial intelligence. The priorities are to implement core models of knowledge and knowing, a reasoning engine, and a simple interface.
TBLTools is a set of GATE processing resources that implements the fast Transformation-Based Learning (TBL) algorithm. You can train it to learn rules for NLP tasks such as named entity recognition and shallow parsing.
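For context, plain transformation-based learning scores candidate rules by net error reduction and greedily keeps the best rule in each pass. The sketch below illustrates only that scoring step; the corpus, tag names, and rule template are made up, and it omits the indexing tricks that make the "fast" variant fast.

```java
import java.util.*;

public class TblSketch {
    // A candidate rule: change fromTag to toTag when the previous token equals prevWord.
    record Rule(String fromTag, String toTag, String prevWord) {}

    public static void main(String[] args) {
        String[] words = {"the", "race", "will", "race", "home"};
        String[] gold  = {"DET", "NOUN", "AUX", "VERB", "NOUN"};
        // Baseline: tag every occurrence of a word with its most frequent tag.
        String[] tags  = {"DET", "NOUN", "AUX", "NOUN", "NOUN"};

        // One learning pass: propose a rule at every error site, score each by
        // net error reduction over the whole corpus, keep the best one.
        Rule best = null;
        int bestGain = 0;
        for (int i = 1; i < words.length; i++) {
            if (tags[i].equals(gold[i])) continue;
            Rule cand = new Rule(tags[i], gold[i], words[i - 1]);
            int gain = score(cand, words, tags, gold);
            if (gain > bestGain) { bestGain = gain; best = cand; }
        }
        System.out.println("learned rule: " + best + " (gain " + bestGain + ")");
        // A full implementation repeats this loop, applying the best rule each
        // round until no rule improves the tagging.
    }

    // Net number of errors fixed minus errors introduced if the rule were applied.
    static int score(Rule r, String[] words, String[] tags, String[] gold) {
        int gain = 0;
        for (int i = 1; i < words.length; i++) {
            if (tags[i].equals(r.fromTag()) && words[i - 1].equals(r.prevWord())) {
                if (gold[i].equals(r.toTag()) && !gold[i].equals(tags[i])) gain++;
                else if (gold[i].equals(tags[i])) gain--;
            }
        }
        return gain;
    }
}
```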
JWNL is a Java API for accessing the WordNet relational dictionary. WordNet is widely used for developing NLP applications, and a Java API such as JWNL will allow developers to more easily use Java for building NLP applications.
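A minimal lookup sketch, assuming the classic JWNL 1.4-style API (net.didion.jwnl packages); the properties-file path is a placeholder, and the method names should be checked against the installed version (the extJWNL fork differs slightly).

```java
import java.io.FileInputStream;

import net.didion.jwnl.JWNL;
import net.didion.jwnl.data.IndexWord;
import net.didion.jwnl.data.POS;
import net.didion.jwnl.data.Synset;
import net.didion.jwnl.dictionary.Dictionary;

public class WordNetLookup {
    public static void main(String[] args) throws Exception {
        // JWNL is configured through an XML properties file that points at the
        // local WordNet dictionary files; this path is a placeholder.
        JWNL.initialize(new FileInputStream("config/jwnl_properties.xml"));
        Dictionary dictionary = Dictionary.getInstance();

        // Look up the noun "dog" and print the gloss of each sense.
        IndexWord word = dictionary.getIndexWord(POS.NOUN, "dog");
        if (word != null) {
            for (Synset sense : word.getSenses()) {
                System.out.println(sense.getGloss());
            }
        }
    }
}
```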
This is a suite of software agents that provides the complete lexical-base architecture proposed in Didier Schwab's PhD thesis. It will be used for automatic translation, information retrieval, and other natural language processing tasks.
The program provides a Java interface (to a C++ Lemmatizer via XML-RPC) for lemmatizing Russian, English, and German text (a lemma is the canonical form of a lexeme in natural language processing). RussianPOSTagger can work as a module of GATE.
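As an illustration of the XML-RPC bridge idea, here is a minimal client sketch using Apache XML-RPC 3; the endpoint URL, method name, and parameters are placeholders, not this project's documented interface.

```java
import java.net.URL;

import org.apache.xmlrpc.client.XmlRpcClient;
import org.apache.xmlrpc.client.XmlRpcClientConfigImpl;

public class LemmatizerClient {
    public static void main(String[] args) throws Exception {
        // Point the client at the lemmatizer's XML-RPC endpoint.
        // URL, method name, and arguments below are hypothetical.
        XmlRpcClientConfigImpl config = new XmlRpcClientConfigImpl();
        config.setServerURL(new URL("http://localhost:8000/RPC2"));
        XmlRpcClient client = new XmlRpcClient();
        client.setConfig(config);

        // Ask the remote C++ process for the lemma of a Russian word form.
        Object lemma = client.execute("lemmatize", new Object[]{"ru", "столами"});
        System.out.println(lemma);
    }
}
```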
MutationFinder is a biomedical natural language processing (NLP) system for extracting mentions of point mutations from free text. As an information extraction system, MutationFinder achieves high performance (99% precision, 81% recall on blind test data).
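For a flavour of the task, the sketch below matches the common wNm mention style (e.g. A123T: wild-type residue, position, mutant residue) with a single regular expression; MutationFinder's real rule set is far richer and also handles three-letter codes and prose mentions.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class MutationMentionSketch {
    // One-letter amino acid codes in the "wNm" style, e.g. A123T.
    private static final Pattern POINT_MUTATION =
            Pattern.compile("\\b([ACDEFGHIKLMNPQRSTVWY])(\\d+)([ACDEFGHIKLMNPQRSTVWY])\\b");

    public static void main(String[] args) {
        String sentence = "The A123T and G56R substitutions abolished binding.";
        Matcher m = POINT_MUTATION.matcher(sentence);
        while (m.find()) {
            System.out.printf("wild-type=%s position=%s mutant=%s%n",
                    m.group(1), m.group(2), m.group(3));
        }
    }
}
```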
JWebPro: a Java tool that can interact with Google search and then process the returned Web documents in several ways. The outputs can serve as inputs for NLP, IR, information extraction, Web mining, and online social network extraction/analysis applications.
NOTE: I have not been able to keep this project aligned with the latest Unicode spec, and I am not sure I will continue it. You can try Myanmar3 from Myanmar NLP, WinUniInnwa, https://sourceforge.net/projects/prahita/, or another more compliant font. ~Victor
---
[This is the UniBurma - UniMM project workshop area. The project currently has two products, UniBurma and UniMM. For more descriptive information about this project, please visit http://unimm.org/. You can browse the latest source from the SVN trunk.]
JVnSegmenter is a Java-based, open-source Vietnamese word segmentation tool. The segmentation model was trained on about 8,000 sentences using Conditional Random Fields (FlexCRFs). This tool should be useful for the Vietnamese NLP community.
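The sketch below is not the CRF model JVnSegmenter trains; it is a dictionary-based longest-match baseline with a made-up lexicon, meant only to show what word segmentation output looks like (grouping syllables into multi-syllable words).

```java
import java.util.*;

public class LongestMatchSegmenter {
    public static void main(String[] args) {
        // Toy lexicon of Vietnamese words (multi-syllable entries joined by spaces).
        Set<String> lexicon = new HashSet<>(Arrays.asList("học sinh", "học", "sinh", "đi"));
        String[] syllables = {"học", "sinh", "đi", "học"};

        // Greedy longest match: at each position, take the longest lexicon entry.
        List<String> words = new ArrayList<>();
        int i = 0;
        while (i < syllables.length) {
            int best = 1;
            for (int len = syllables.length - i; len >= 1; len--) {
                String cand = String.join(" ", Arrays.copyOfRange(syllables, i, i + len));
                if (lexicon.contains(cand)) { best = len; break; }
            }
            words.add(String.join(" ", Arrays.copyOfRange(syllables, i, i + best)));
            i += best;
        }
        System.out.println(words); // [học sinh, đi, học]
    }
}
```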
AutoSummary uses Natural Language Processing to generate a contextually relevant synopsis of plain text. It uses statistical and rule-based methods for part-of-speech tagging, word sense disambiguation, sentence deconstruction, and semantic analysis.
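A minimal sketch of the statistical side only: frequency-based sentence scoring with length normalization. It is not AutoSummary's pipeline, and the example text is invented.

```java
import java.util.*;

public class FrequencySummarizer {
    public static void main(String[] args) {
        String text = "The reactor overheated on Tuesday. Engineers shut the reactor down. "
                + "A report on the incident will follow next week.";

        // Split into sentences on end-of-sentence punctuation.
        String[] sentences = text.split("(?<=[.!?])\\s+");

        // Count how often each word occurs across the whole document.
        Map<String, Integer> freq = new HashMap<>();
        for (String w : text.toLowerCase().split("\\W+")) {
            if (!w.isEmpty()) freq.merge(w, 1, Integer::sum);
        }

        // Score each sentence by its average word frequency and keep the best one.
        String best = sentences[0];
        double bestScore = -1;
        for (String s : sentences) {
            double score = 0;
            int n = 0;
            for (String w : s.toLowerCase().split("\\W+")) {
                if (w.isEmpty()) continue;
                score += freq.getOrDefault(w, 0);
                n++;
            }
            if (n > 0) score /= n;   // normalize so long sentences are not favoured
            if (score > bestScore) { bestScore = score; best = s; }
        }
        System.out.println(best);    // the one-sentence "summary"
    }
}
```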
CoPT, Corpus Processing Tools, is a set of Java classes intended to assist field linguists, NLP researchers, students, and software developers in all corpus-related processing.
The nlpFarm is a Natural Language Processing (NLP) resource where early research prototypes (Java) can evolve into robust and useful open source. Our farmstead collaborates under the OpenNLP initiative in order to make NLP software publicly available.
uNLPBot is a chatter bot based on natural language processing theory, able to parse small but representative subsets of the English language and to produce English sentences that comply with English grammar and relate to the conversation thread.
Grok is a library of natural language processing components, including support for parsing with categorial grammars and various preprocessing tasks such as part-of-speech tagging, sentence detection, and tokenization.
WhiteBeer is a new programming paradigm motivated by Noam Chomsky's Minimalist Program. It uses a feature-checking mechanism to parse programs, providing word-order independence similar to natural language.
Metafrastes is a natural language processing system that takes an input query from the user and translates it internally in order to communicate with a knowledge base in the form of an ontology.
NXTAI's main objective is to create a new generation of natural language processing programs using a unique neural networking model. To demonstrate that the algorithm works, a chatbot will be developed as a classic example of how it can be used.
This project aims at creating rationally thinking agents. The agent gathers information through the command line or the network and stores it in its memory. It uses Stanford's NLP library to understand language statements.