Showing 1664 open source projects for "python-telegram-bot"

  • 1
    LinkChecker

    Check links in web documents or full websites

    LinkChecker is a free, GPL-licensed website validator that checks links in web documents or full websites. It runs on Python 3 systems and requires Python 3.8 or later. The version in the pip repository may be old; for how to get the latest code, plus platform-specific information and other advice, see doc/install.txt in the source code archive. If you do not want to install any additional libraries or dependencies, you can use the Docker image published on GitHub Packages.
    Downloads: 0 This Week
  • 2
    Selectolax

    Python binding to Modest and Lexbor engines

    A fast HTML5 parser with CSS selectors, using the Modest and Lexbor engines. Selectolax supports two backends: Modest and Lexbor. By default, all examples use the Modest backend. Most features are nearly identical between the backends, but there are still some differences; currently, the Lexbor backend is in beta and missing some features. To use Lexbor, just import its parser and use it the same way as HTMLParser (see the sketch below).
    Downloads: 0 This Week
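
    A minimal sketch of using both backends, assuming the selectolax package is installed; the sample HTML and CSS selector are illustrative, not taken from the project docs:

        from selectolax.parser import HTMLParser        # Modest backend (default)
        from selectolax.lexbor import LexborHTMLParser  # Lexbor backend (beta)

        html = "<div><p class='note'>hello</p><p>world</p></div>"

        for parser_cls in (HTMLParser, LexborHTMLParser):
            tree = parser_cls(html)                                # parse the document
            texts = [node.text() for node in tree.css("p.note")]   # CSS selector query
            print(parser_cls.__name__, texts)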
  • 3
    OnionShare

    Securely and anonymously share files of any size

    OnionShare is an open source tool that allows you to securely and anonymously share files of any size, host websites, and chat with friends using the Tor network. There is no need for middlemen that could very well violate the privacy and security of the things you share online. With OnionShare, you can share files directly with just an address in Tor Browser, because each share is made accessible as a Tor onion service. All you need to do is open it and drag and drop the files you...
    Downloads: 4 This Week
  • 4
    django CMS

    Easy-to-use and developer-friendly enterprise CMS powered by Django

    Create modern websites that content editors love. django CMS was originally conceived by web developers frustrated with the technical and security limitations of other systems. Its lightweight core makes it easy to integrate with other software and put to use immediately, while its ease of use makes it the go-to choice for content managers, content editors and website admins. Developers can integrate other existing Django applications rapidly, or build brand new compatible apps that take...
    Downloads: 5 This Week
  • 5
    Scout Suite

    Multi-cloud security auditing tool

    Scout Suite is an open-source multi-cloud security-auditing tool, which enables security posture assessment of cloud environments. Using the APIs exposed by cloud providers, Scout Suite gathers configuration data for manual inspection and highlights risk areas. Rather than going through dozens of pages on the web consoles, Scout Suite presents a clear view of the attack surface automatically. Scout Suite was designed by security consultants/auditors. It is meant to provide a point-in-time...
    Downloads: 5 This Week
  • 6
    Graylog Ansible Role

    Ansible role which installs and configures Graylog

    Ansible role which installs and configures Graylog.
    Downloads: 0 This Week
  • 7
    dude uncomplicated data extraction

    dude uncomplicated data extraction: A simple framework

    Dude is a very simple framework for writing web scrapers using Python decorators. The design, inspired by Flask, aims to let you build a web scraper in just a few lines of code with an easy-to-learn syntax. Dude is currently in pre-alpha, so expect breaking changes. You can run your scraper from the terminal, shell, or command line by supplying URLs, an output filename of your choice, and the paths to your Python scripts to the dude scrape command (see the sketch below).
    Downloads: 0 This Week
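
    A rough sketch of the decorator syntax, loosely following the project's examples; the selector, field names, and URL are illustrative, and the element methods assume the default Playwright backend:

        from dude import select

        @select(css="a")  # run this handler for every matching element
        def link(element):
            return {"url": element.get_attribute("href"), "text": element.text_content()}

        # Run from the terminal, e.g.:
        #   dude scrape --url "https://example.com" path/to/this_script.py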
  • 8
    Trafilatura

    Python & command-line tool to gather text on the Web

    Trafilatura is a Python package and command-line tool designed to gather text on the Web. It includes discovery, extraction and text-processing components. Its main applications are web crawling, downloads, scraping, and extraction of main texts, metadata and comments. It aims to stay handy and modular: no database is required, and the output can be converted to various commonly used formats (see the sketch below).
    Downloads: 0 This Week
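
    A minimal usage sketch, assuming the trafilatura package is installed; the URL is a placeholder:

        import trafilatura

        downloaded = trafilatura.fetch_url("https://example.com/article")  # fetch the page
        if downloaded is not None:
            text = trafilatura.extract(downloaded)  # extract the main text
            print(text)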
  • 9
    Scrapy-Redis

    Redis-based components for Scrapy

    ...Scheduler + Duplication Filter, Item Pipeline, Base Spiders. The default requests serializer is pickle, but it can be changed to any module with loads and dumps functions. Note that pickle is not compatible between Python versions. Version 0.3 changed the requests serialization from marshal to cPickle, so requests persisted with version 0.2 will not work on 0.3. The class scrapy_redis.spiders.RedisSpider enables a spider to read URLs from Redis. The URLs in the Redis queue are processed one after another; if the first request yields more requests, the spider processes those before fetching another URL from Redis (see the sketch below).
    Downloads: 0 This Week
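
    A sketch of a Redis-fed spider, assuming a Scrapy project whose settings enable the scrapy_redis scheduler and duplication filter; the spider name, Redis key, and parse logic are illustrative:

        # settings.py (excerpt)
        #   SCHEDULER = "scrapy_redis.scheduler.Scheduler"
        #   DUPEFILTER_CLASS = "scrapy_redis.dupefilter.RFPDupeFilter"
        #   REDIS_URL = "redis://localhost:6379"

        from scrapy_redis.spiders import RedisSpider

        class MySpider(RedisSpider):
            name = "myspider"
            redis_key = "myspider:start_urls"  # Redis list the spider reads URLs from

            def parse(self, response):
                yield {"url": response.url, "title": response.css("title::text").get()}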
  • 10
    hosts

    Consolidate and extend hosts files from several well-curated sources

    Consolidating and extending hosts files from several well-curated sources. You can optionally pick extensions to block pornography, social media, and other categories. The unified hosts file is optionally extensible. Extensions are used to include domains by category. Currently, we offer the following categories: fakenews, social, gambling, and porn. Extensions are optional, and can be combined in various ways with the base hosts file. The combined products are stored in the alternates...
    Downloads: 7 This Week
  • 11
    Tabby Web

    An SSH/Telnet/Serial client in your browser

    Tabby Web brings a modern terminal experience to the browser by pairing a web UI with a backend gateway that brokers TCP connections over WebSockets. It aims to deliver an experience similar to the desktop Tabby terminal—sessions, profiles, and rich configuration—while being accessible anywhere through a login. The architecture splits concerns: a Django-based control plane manages users, auth, and configuration, while a gateway service handles network transport so browser clients can reach...
    Downloads: 6 This Week
  • 12
    Amazon Braket PennyLane Plugin

    A plugin that lets Xanadu PennyLane use Amazon Braket devices

    The Amazon Braket PennyLane plugin offers two Amazon Braket quantum devices to work with PennyLane. The Amazon Braket Python SDK is an open-source library that provides a framework to interact with quantum computing hardware devices and simulators through Amazon Braket. PennyLane is a machine learning library for optimization and automatic differentiation of hybrid quantum-classical computations. Once the PennyLane-Braket plugin is installed, the provided Braket devices can be accessed straight away in PennyLane, without the need to import any additional packages (see the sketch below). ...
    Downloads: 1 This Week
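
    A small sketch of selecting a Braket device from PennyLane, assuming the plugin is installed and that the local simulator is exposed as "braket.local.qubit" (managed devices use "braket.aws.qubit" plus a device ARN); the circuit itself is illustrative:

        import pennylane as qml

        dev = qml.device("braket.local.qubit", wires=2)  # plugin-provided local simulator

        @qml.qnode(dev)
        def circuit(theta):
            qml.RX(theta, wires=0)
            qml.CNOT(wires=[0, 1])
            return qml.expval(qml.PauliZ(1))

        print(circuit(0.54))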
  • 13
    FastHX

    FastAPI server-side rendering with built-in HTMX support.

    FastHX brings server-side rendering with built-in HTMX support to FastAPI applications.
    Downloads: 0 This Week
  • 14
    Tweepy

    Twitter for Python

    An easy-to-use Python library for accessing the Twitter API. The easiest way to install the latest release from PyPI is with pip; you can also use Git to clone the repository from GitHub and install the latest development version. Twitter requires all requests to use OAuth for authentication, and the API class provides access to the entire Twitter RESTful API (see the sketch below).
    Downloads: 0 This Week
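
    A minimal sketch of OAuth authentication and an API call, assuming a recent Tweepy 4.x release (where the OAuth 1.0a handler is named OAuth1UserHandler); the credentials are placeholders:

        import tweepy

        auth = tweepy.OAuth1UserHandler(
            "CONSUMER_KEY", "CONSUMER_SECRET",
            "ACCESS_TOKEN", "ACCESS_TOKEN_SECRET",
        )
        api = tweepy.API(auth)  # access to the Twitter RESTful API

        for tweet in api.home_timeline(count=5):  # authenticated user's home timeline
            print(tweet.user.screen_name, tweet.text)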
  • 15
    Ansible Role s3cmd

    Ansible role for s3cmd. Available on Ansible Galaxy

    Role to install (by default) s3cmd on Debian/Ubuntu and EL systems. s3cmd is a popular S3 client.
    Downloads: 0 This Week
  • 16
    Ajenti 2

    Ajenti Core and stock plugins

    ...Does not overwrite your config files, options and comments. All changes are non-destructive. Includes lots of plugins for system and software configuration, monitoring and management. Ajenti 2 is easily extensible using Python. Plugin development is quick and pleasant with Ajenti APIs. Write your first plugin. Pleasant to look at, satisfying to click and accessible anywhere from tablets and mobile. Small memory footprint and CPU usage. Runs on low-end machines, wall plugs, routers and so on.
    Downloads: 0 This Week
  • 17
    Toot

    toot - Mastodon CLI & TUI

    Toot is a CLI and TUI tool for interacting with Mastodon instances from the command line.
    Downloads: 0 This Week
  • 18
    img2dataset

    Easily turn large sets of image URLs into an image dataset

    Easily turn large sets of image URLs into an image dataset. It can download, resize, and package 100M URLs in 20 hours on one machine, and also supports saving captions for URL+caption datasets. Opt-out directives: websites can pass the HTTP headers X-Robots-Tag: noai, X-Robots-Tag: noindex, X-Robots-Tag: noimageai, and X-Robots-Tag: noimageindex; by default, img2dataset will ignore images served with such headers (see the sketch below).
    Downloads: 0 This Week
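
    A usage sketch, assuming a urls.txt file with one image URL per line; the parameter values are illustrative:

        from img2dataset import download

        download(
            url_list="urls.txt",     # input list of image URLs
            output_folder="images",  # where downloaded and resized images are written
            thread_count=16,         # parallel download threads
            image_size=256,          # resize target
        )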
  • 19

    ipfetch-python

    Get your public IP Address on the Command Line

    This small but handy Python snippet requests the public IP address (IPv4 and IPv6 are supported) from myip.is and outputs the result. It is useful for getting your public IP address on the command line, or even for checking whether your internet connection is working (see the sketch below). Usage: python ipfetch.py
    Downloads: 6 This Week
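
    A rough sketch of what such a snippet boils down to, using the requests library; the endpoint URL is a hypothetical stand-in for the myip.is lookup the script uses, and the real script's URL and response handling may differ:

        import requests

        # Hypothetical endpoint standing in for the myip.is lookup; assumed here
        # to return the caller's address as plain text.
        response = requests.get("https://myip.is/", timeout=10)
        response.raise_for_status()
        print(response.text.strip())  # output the reported public IP address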
  • 20
    proxy.py

    Utilize all available CPU cores for accepting new client connections

    proxy.py is made with performance in mind. By default, proxy.py will try to utilize all CPU cores available to it for accepting new client connections. This is achieved by starting an AcceptorPool, which listens on the configured server port; the AcceptorPool then starts Acceptor processes (--num-acceptors) to accept incoming client connections. Alongside this, if --threadless is enabled, a ThreadlessPool is set up, which starts Threadless processes (--num-workers) to handle the incoming client connections (see the sketch below)...
    Downloads: 2 This Week
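
    A sketch of embedding proxy.py in a script, assuming a recent release where proxy.main() accepts a list of command-line style flags; the flag values are illustrative and mirror the options mentioned above:

        import proxy

        if __name__ == "__main__":
            # Tune acceptor/worker counts to the machine; --threadless enables ThreadlessPool.
            proxy.main(["--num-acceptors", "2", "--num-workers", "4", "--threadless"])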
  • 21
    Changelog CI

    Changelog CI is a GitHub Action that enables a project to generate changelogs automatically

    Changelog CI is a GitHub Action that enables a project to automatically generate changelogs. Changelog CI can be triggered on pull_request, workflow_dispatch, and any other events that can provide the required inputs. Changelog CI uses Python and the GitHub API to generate a changelog for a repository. First, it tries to get the latest release from the repository (if available). Then, it checks all the pull requests/commits merged after the last release using the GitHub API. After that, it parses the data and generates the changelog. It can generate the changelog in Markdown or reStructuredText. ...
    Downloads: 0 This Week
  • 22
    ScrapydWeb

    Web app for Scrapyd cluster management

    Web app for Scrapyd cluster management, with support for Scrapy log analysis & visualization. Make sure that Scrapyd has been installed and started on all of your hosts. Start ScrapydWeb via the scrapydweb command (a config file will be generated for customizing settings on first startup). Add your Scrapyd servers; both string and tuple formats are supported, and you can attach basic auth for accessing the Scrapyd server, as well as a string for grouping or labeling. You can select any...
    Downloads: 1 This Week
  • 23
    Grab Framework Project

    Web Scraping Framework

    Grab is a Python framework for building web scrapers. With Grab you can build scrapers of varying complexity, from simple 5-line scripts to complex asynchronous website crawlers processing millions of web pages. Grab provides an API for performing network requests and for handling the received content, e.g. interacting with the DOM tree of the HTML document (see the sketch below).
    Downloads: 0 This Week
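
    A minimal sketch of a Grab request and DOM query, assuming the grab package is installed; the URL and XPath expression are placeholders:

        from grab import Grab

        g = Grab()
        g.go("https://example.com")            # perform the network request
        print(g.doc.select("//title").text())  # query the DOM tree of the received page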
  • 24
    Dataproc Templates

    Dataproc templates and pipelines for solving simple in-cloud data tasks

    Dataproc templates are designed to address various in-cloud data tasks, including data import/export/backup/restore and bulk API operations. These templates leverage the power of Google Cloud's Dataproc, supporting both Dataproc Serverless and Dataproc clusters. Google provides this collection of pre-implemented Dataproc templates as a reference and for easy customization.
    Downloads: 0 This Week
  • 25
    WordOps

    Install and manage a high performance WordPress stack

    An essential toolset that eases WordPress site and server administration.
    Downloads: 0 This Week