AI Text Generators for Linux

  • 1
    PerlPP

    Perl preprocessor - embed Perl source in any file

    Translates Text+Perl to Text. It can be used for any kind of text templating, e.g., code generation. No external modules are required, just a single file; Perl 5.10.1+ is needed. PerlPP runs in two passes: it first generates a Perl script from your input, and then it runs the generated script (see the sketch after this entry). If you see an error at (eval ##) (for some number ##), it means there was an error in the generated script. The -D switch defines elements of %D; if you do not specify a value, the value true (a constant in the generated script) is used. PerlPP's commands work mostly analogously to their C preprocessor counterparts, but their arguments (such as an include filename $fn) can be determined programmatically. Note that defines set with -D or -s do not take effect until after the script has been generated, which is after the macro code runs.
    Downloads: 0 This Week
    Last Update:
    See Project
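
    To make the two-pass idea concrete, here is a rough Python analogue (not PerlPP itself; the <?= ... ?> marker syntax is invented for illustration): pass 1 turns the template into the source of a small script, and pass 2 runs that generated script. An error raised while running the generated script is the analogue of PerlPP's "error at (eval ##)" case.

```python
# Conceptual sketch of a two-pass templater, in Python rather than Perl.
# Pass 1 builds a script from the template text; pass 2 executes it.
# The "<?= expr ?>" marker syntax is a made-up stand-in, not PerlPP's own.
import re

def compile_template(text):
    """Pass 1: turn template text into the source of a generator script."""
    lines = ["import sys", "out = sys.stdout.write"]
    pos = 0
    for m in re.finditer(r"<\?=(.*?)\?>", text, re.S):
        lines.append(f"out({text[pos:m.start()]!r})")       # copy literal text through
        lines.append(f"out(str({m.group(1).strip()}))")     # evaluate embedded expression
        pos = m.end()
    lines.append(f"out({text[pos:]!r})")
    return "\n".join(lines)

template = "Hello <?= name.upper() ?>, 2 + 2 = <?= 2 + 2 ?>\n"
script = compile_template(template)    # pass 1: generate the script
exec(script, {"name": "world"})        # pass 2: run it -> "Hello WORLD, 2 + 2 = 4"
```
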
  • 2
    Pipeline for training Language Models

    Pipeline for training Language Models using PyTorch.

    Pipeline for training Language Models using PyTorch. Inspired by the Yandex Data School NLP Course (week 03: Language Modeling). The pipeline expects a prepared text file with space-separated words on each line (see the sketch after this entry).
    Downloads: 0 This Week
    Last Update:
    See Project
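
    The expected input is simple enough to sketch: a plain-text file with one sentence per line and tokens separated by spaces. The file name and the naive tokenization below are illustrative assumptions, not part of the project.

```python
# Write a corpus file in the "space-separated words on each line" format.
# A real setup would use a proper tokenizer; this split is just for show.
sentences = [
    "Language models assign probabilities to sequences of words.",
    "The pipeline expects one pre-tokenized sentence per line.",
]

with open("corpus.txt", "w", encoding="utf-8") as f:
    for sentence in sentences:
        tokens = sentence.lower().replace(".", " .").split()
        f.write(" ".join(tokens) + "\n")
```
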
  • 3
    Python Client For NLP Cloud

    NLP Cloud serves high performance pre-trained or custom models for NER

    NLP Cloud serves high-performance pre-trained or custom models for NER, sentiment analysis, classification, summarization, dialogue summarization, paraphrasing, intent classification, product description and ad generation, chatbots, grammar and spelling correction, keyword and keyphrase extraction, text generation, image generation, blog post generation, source code generation, question answering, automatic speech recognition, machine translation, language detection, semantic search, semantic similarity, tokenization, POS tagging, embeddings, and dependency parsing. It is ready for production and served through a REST API. You can use the NLP Cloud pre-trained models, fine-tune your own models, or deploy your own models (a usage sketch follows this entry).
    Downloads: 0 This Week
    Last Update:
    See Project
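
    A minimal usage sketch with the Python client (pip install nlpcloud) might look like the following; the Client constructor, the entities() call, the model name, and the token placeholder are assumptions based on the project's documentation, so check the current API before relying on them.

```python
# Hedged sketch: named entity recognition through the NLP Cloud REST API
# via its Python client. Model name and method name are assumptions.
import nlpcloud

client = nlpcloud.Client("en_core_web_lg", "<your_api_token>")  # model + API token
result = client.entities("John Doe has worked for Microsoft in Seattle since 1999.")
print(result)  # expected: the recognized entities with their types and offsets
```
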
  • 4
    Regex

    Generate matching and non-matching strings based on regex patterns

    Generate matching and non-matching strings. This is a Java library that, given a regex pattern, can generate matching strings, iterate through unique matching strings, and generate non-matching strings. Follow the link to an online IDE with the project already created (JDoodle), enter your pattern, and see the results. By design, the a+, a*, and a{n,} patterns imply that an unbounded number of characters may be matched; when generating data, that would mean values of potentially infinite length. Since it is highly doubtful anyone requires a string of infinite length, repetitions in such patterns are artificially limited to 100 symbols when generating random values (see the sketch after this entry). Use a{n,m} if you require a specific number of repetitions, and avoid such unbounded patterns when generating data based on a regex.
    Downloads: 0 This Week
    Last Update:
    See Project
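
    The repetition cap is easy to illustrate outside Java. The sketch below (Python, purely conceptual, not the library's API) shows why an unbounded quantifier such as a+ needs an artificial upper bound when sampling random matches.

```python
# Why unbounded quantifiers get capped: "a+" permits any length >= 1, so a
# random-match generator must impose a limit (100 here, mirroring the
# library's documented default).
import random

MAX_REPETITIONS = 100

def random_repeat(char, min_count, max_count=None):
    """Repeat `char` a random number of times within the (possibly capped) bounds."""
    upper = max_count if max_count is not None else MAX_REPETITIONS
    return char * random.randint(min_count, upper)

print(random_repeat("a", 1))      # like "a+"     -> 1 to 100 characters
print(random_repeat("a", 2, 5))   # like "a{2,5}" -> bounded, no cap needed
```
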
  • 5
    ShortGPT Lite

    Get short and concise answers from GPT-3/GPT-4

    ShortGPT Lite is a simple tool for Windows/Linux based on OpenAI's GPT-3/GPT-4 large language models. The main focus is getting quick and concise answers from GPT. ShortGPT is now available on Android: https://play.google.com/store/apps/details?id=io.github.rupeshs.shortgpt_lite. A basic web version of ShortGPT is also available to try for free: https://nolowiz.com/shortgpt-get-short-and-concise-answers-from-gpt-for-free/
    Downloads: 0 This Week
    Last Update:
    See Project
  • 6
    TFKit

    Handle multiple NLP tasks in one pipeline

    TFKit is a toolkit mainly for language generation. It leverages transformers for many tasks with different models in one all-in-one framework; all you need is a small change of config. You can use tfkit for model training and evaluation with tfkit-train and tfkit-eval. The key to combining different tasks is giving every task the same data format: all data is in CSV, normally with two columns, where the first column is the model input and the second column is the model output (see the sketch after this entry). The text stays plain, with no tokenization required; there is no need to tokenize text before training or to recompute tokenization, as tfkit handles it for you. No header row is needed.
    Downloads: 0 This Week
    Last Update:
    See Project
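
    The data format described above is just a headerless two-column CSV: model input on the left, model output on the right, plain untokenized text in both. A small sketch of producing such a file (file name and rows are illustrative only):

```python
# Write a headerless two-column CSV: column 1 = model input, column 2 = model output.
# tfkit is described as handling tokenization itself, so the text stays plain.
import csv

rows = [
    ("summarize: the cat sat on the mat all afternoon", "the cat rested on the mat"),
    ("summarize: rain is expected across the region tomorrow", "rain expected tomorrow"),
]

with open("train.csv", "w", newline="", encoding="utf-8") as f:
    csv.writer(f).writerows(rows)
```
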
  • 7
    Texar-PyTorch

    Integrating the Best of TF into PyTorch, for Machine Learning

    Texar-PyTorch is a toolkit aiming to support a broad set of machine learning tasks, especially natural language processing and text generation. Texar provides a library of easy-to-use ML modules and functionalities for composing models and algorithms, designed for both researchers and practitioners for fast prototyping and experimentation. Texar-PyTorch was originally developed by, and is actively contributed to by, Petuum and CMU in collaboration with other institutes; a mirror of this repository is maintained by Petuum Open Source. Texar-PyTorch integrates many of the best features of TensorFlow into PyTorch, delivering highly usable and customizable modules beyond PyTorch's native ones. Texar-PyTorch (this repo) and Texar-TF have mostly the same interfaces, and both combine the best designs of TF and PyTorch across data processing, model architectures, loss functions, training and inference algorithms, evaluation, and more.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 8
    Text Gen

    Almost state-of-the-art text generation library

    Almost state-of-the-art text generation library. Text Gen is a Python library that lets you build a custom text generation model with ease. It is something sweet built with TensorFlow, with PyTorch support coming soon. Load your data, which must be in a text format; you can download the example data from the example folder. Then tune your model to find the best optimizer and activation method to use.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 9
    TextBox

    A text generation library with pre-trained language models

    TextBox 2.0 is an up-to-date text generation library based on Python and PyTorch focusing on building a unified and standardized pipeline for applying pre-trained language models to text generation. From a task perspective, we consider 13 common text generation tasks such as translation, story generation, and style transfer, and their corresponding 83 widely-used datasets. From a model perspective, we incorporate 47 pre-trained language models/modules covering the categories of general, translation, Chinese, dialogue, controllable, distilled, prompting, and lightweight models (modules). From a training perspective, we support 4 pre-training objectives and 4 efficient and robust training strategies, such as distributed data parallel and efficient generation. Compared with the previous version of TextBox, this extension mainly focuses on building a unified, flexible, and standardized framework for better supporting PLM-based text generation models.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 10
    TextGen

    textgen: text generation models

    Implementation of text generation models. textgen implements a variety of text generation models, including UDA, GPT2, Seq2Seq, BART, T5, SongNet, and others, out of the box. UDA performs non-core word replacement; EDA is a simple data augmentation technique based on similar words and synonym replacement plus random word insertion, deletion, and replacement. The project follows Google's UDA (non-core word replacement) algorithm and the EDA algorithm, using TF-IDF to replace unimportant words in a sentence with synonyms and applying random word insertion, deletion, and replacement to generate new text for augmentation (a small sketch of the EDA idea follows this entry). The project also implements back translation based on the Baidu translation API: Chinese sentences are first translated into English and then translated back into new Chinese sentences. Finally, it implements training and prediction for Seq2Seq, ConvSeq2Seq, and BART models in PyTorch, which can be used for text generation tasks such as translation.
    Downloads: 0 This Week
    Last Update:
    See Project
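
    The EDA-style augmentation mentioned above is simple to sketch. The following snippet shows only the random deletion and random swap operations on a toy sentence; the textgen library's own implementation additionally uses TF-IDF weighting and synonym lookups, which are omitted here.

```python
# Toy EDA-style augmentation: random word deletion and random word swap.
import random

def random_deletion(words, p=0.1):
    """Drop each word with probability p, keeping at least one word."""
    kept = [w for w in words if random.random() > p]
    return kept or [random.choice(words)]

def random_swap(words, n_swaps=1):
    """Swap two randomly chosen positions, n_swaps times."""
    words = list(words)
    for _ in range(n_swaps):
        i, j = random.sample(range(len(words)), 2)
        words[i], words[j] = words[j], words[i]
    return words

sentence = "text augmentation generates new training sentences from old ones".split()
print(" ".join(random_deletion(sentence)))
print(" ".join(random_swap(sentence)))
```
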
  • 11
    abstract2paper

    Auto-generate an entire paper from a prompt or abstract using NLP

    Enter your abstract into the little doohicky here, and quicker'n you can blink your eyes, a shiny new paper'll come right out for ya! What are you waiting for? Click the "doohicky" link above to get started, then click the link to open the demo notebook in Google Colaboratory. To run the demo as a Jupyter notebook (e.g., locally), use this version instead. Note: to compile a PDF of your auto-generated paper when you run the demo locally, you'll need a working LaTeX installation on your machine (e.g., so that pdflatex is a recognized system command). The notebook will also automatically install the transformers library if it's not already available in your local environment. In its unmodified state, the demo notebooks use the abstract from the GPT-3 paper as the "seed" for a new paper. Each time you run the notebook you'll get a new result.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 12
    artikelschreiber

    Frontend and Backend Code for ArtikelSchreiber.com and UNAIQUE.NET

    Frontend and backend code for ArtikelSchreiber.com and UNAIQUE.NET, a free German-language AI text generator (in German: "Text Generator deutsch - Dein KI Text Generator kostenlos mit Künstlicher Intelligenz", i.e., your free AI text generator powered by artificial intelligence). The Software as a Service can be found here: SEO Optimizer: Ghost Writer (writing term papers with AI) and KI Text Generator. This product includes software developed by Sebastian Enger, M.Sc. Copyright (c) 2023, Sebastian Enger, M.Sc. All rights reserved. Frontend and backend source code for the project: https://github.com/sebastianenger1981/ https://www.artikelschreiber.com/ https://www.artikelschreiben.com/ https://www.unaique.net/
    Downloads: 0 This Week
    Last Update:
    See Project
  • 13
    commit-autosuggestions

    A tool in which AI automatically recommends commit messages

    This is an implementation of CommitBERT: Commit Message Generation Using Pre-Trained Programming Language Model. CommitBERT was accepted at the ACL workshop NLP4Prog. Have you ever hesitated over writing a commit message? Now get a commit message from artificial intelligence! CodeBERT: A Pre-Trained Model for Programming and Natural Languages introduces a model pre-trained on a combination of programming language and natural language (PL-NL), and it also introduces the problem of converting code into natural language (code documentation generation). We can use CodeBERT to create a model that generates a commit message when code is added. However, most code changes are not made only by adding code; some parts of the code are also deleted. We plan to slowly add support for languages that are not currently covered. To run this project, you need a Flask-based inference server (GPU) and a client (commit module). If you don't have a GPU, don't worry, you can use it through Google Colab.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 14
    hebrew-gpt_neo

    Hebrew text generation models based on EleutherAI's gpt-neo

    Hebrew text generation models based on EleutherAI's gpt-neo. Each was trained on a TPUv3-8, which was made available to me via the TPU Research Cloud Program. The Open Super-large Crawled ALMAnaCH coRpus (OSCAR) is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture. A loading sketch follows this entry.
    Downloads: 0 This Week
    Last Update:
    See Project
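
    Loading one of these checkpoints with Hugging Face transformers might look like the sketch below; the model id "Norod78/hebrew-gpt_neo-small" is an assumption, so check the project page for the exact checkpoint names.

```python
# Hedged sketch: generate Hebrew text from a gpt-neo checkpoint with transformers.
# The model id below is assumed; substitute the id listed on the project page.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Norod78/hebrew-gpt_neo-small"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("שלום, קוראים לי", return_tensors="pt")  # "Hello, my name is"
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
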
  • 15
    hfapigo

    Unofficial Go (Golang) bindings for the Hugging Face Inference API

    Unofficial Go (Golang) bindings for the Hugging Face Inference API. Directly call any model available in the Model Hub. An API key is required for authorized access; to get one, create a Hugging Face profile (a sketch of the underlying REST call follows this entry).
    Downloads: 0 This Week
    Last Update:
    See Project
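
    These bindings presumably wrap the Inference API's REST endpoint. For consistency with the other sketches in this list, the illustration below is Python rather than Go; the endpoint shape and payload are assumptions based on the public Inference API and may change.

```python
# Hedged sketch of the underlying REST call that Inference API clients wrap.
import requests

API_URL = "https://api-inference.huggingface.co/models/gpt2"
headers = {"Authorization": "Bearer <your_hf_api_key>"}  # key from your Hugging Face profile

response = requests.post(API_URL, headers=headers, json={"inputs": "Linux is"})
print(response.json())  # text-generation models typically return a "generated_text" field
```
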
  • 16
    node-markov-generator

    Generates simple sentences based on a given text corpus

    This simple generator emits short sentences based on the given text corpus using a Markov chain. Put simply, it works much like the word suggestions you see while typing messages on your smartphone: it analyzes which word follows which in the given corpus, and how often, and then for any given word it tries to predict what the next one might be (see the sketch after this entry). You create an instance of TextGenerator, passing it an array of strings; this is the text corpus that will be used to "train" the generator. The more strings/sentences you pass, the more diverse the results, so you should pass hundreds of them or even more. If your texts live in an external file, you can instead pass the path to that file as an argument to TextGenerator's constructor.
    Downloads: 0 This Week
    Last Update:
    See Project
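
    The Markov-chain idea is compact enough to sketch. The snippet below is Python rather than Node.js and is not the library's API; it just counts which word follows which in a toy corpus and then samples a continuation.

```python
# Toy word-level Markov chain: count successors, then sample a continuation.
import random
from collections import defaultdict

corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
]

successors = defaultdict(list)          # word -> every word seen right after it
for line in corpus:
    words = line.split()
    for current, following in zip(words, words[1:]):
        successors[current].append(following)

word, sentence = "the", ["the"]
for _ in range(6):
    if word not in successors:
        break
    word = random.choice(successors[word])   # duplicates make frequent words likelier
    sentence.append(word)
print(" ".join(sentence))
```
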
  • 17
    node-red-contrib-custom-chatgpt
    A Node-RED node that interacts with OpenAI machine learning models such as ChatGPT. Install it with the built-in Node-RED Palette Manager. When editing the node's properties, get your OPENAI_API_KEY by logging in to your OpenAI account and creating a new secret key, then copy and paste that API key into the node's API_KEY property. msg.payload should be a well-written prompt that provides enough information for the model to know what you want and how it should respond; success generally depends on the complexity of the task and the quality of your prompt. A good rule of thumb is to think about how you would write a word problem for a middle schooler to solve.
    Downloads: 0 This Week
    Last Update:
    See Project