43 programs for "user mode linux" with 2 filters applied:

  • 1
    J-Obey is a Java library/package that gives people writing their own crawlers a stable robots.txt parser. If you are writing a web crawler of any sort, you can use J-Obey to take the hassle out of writing a robots.txt parser/interpreter (see the robots.txt parsing sketch after this list).
  • 2
    Funnel is a project for use on intranets, or on selected sites on the Internet, to gather and index information from several different sources and make it available through a sane, usable interface.
  • 3
    Larbin is a web crawler intended to fetch a large number of web pages; it should be able to fetch more than 100 million pages on a standard PC with plenty of upload/download bandwidth. This set of PHP and Perl scripts, called webtools4larbin, can handle the output of Larbin.
  • 4
    WebLoupe is a Java-based tool for analysis, interactive visualization (sitemap), and exploration of the information architecture and specific properties of local or publicly accessible websites. It is based on web spider (or web crawler) technology.
  • 5
    A multi-threaded web spider that finds free porn thumbnail galleries by visiting a list of known TGPs (Thumbnail Gallery Posts). It optionally downloads the located pictures and movies. A TGP list is included. It is a public-domain Perl script that runs on Linux.
  • 6
    Nomad is a tiny but efficient search engine and web crawler. It works very well for searching within a set of corporate websites on the Internet and/or an intranet's HTML documents or knowledge repositories.
  • 7
    A new web crawler with a sophisticated search process, specialized by language!
  • 8
    Web Textual eXtraction Tools: a C++ parallel web crawler, noun phrase identification, multilingual part-of-speech tagging, Tarjan's algorithm, co-relationship mappings, and more...
  • 9
    Larbin is an HTTP Web crawler with an easy interface that runs under Linux. It can fetch more than 5 million pages a day on a standard PC (with a good network).
  • 10
    Arachnid is a Java-based web spider framework. It includes a simple HTML parser object that parses an input stream containing HTML content. Simple web spiders can be created by subclassing Arachnid and adding a few lines of code that are called after each page is visited (see the subclass-and-callback sketch after this list).
  • 11
    WebSPHINX is a web crawler (robot, spider) Java class library, originally developed by Robert Miller of Carnegie Mellon University. It offers multithreading, tolerant HTML parsing, URL filtering and page classification, pattern matching, mirroring, and more.
  • 12
    Harvest is a web indexing package. Originally designed for distributed indexing, it can form a powerful system for indexing both large and small web sites. It also now includes Harvest-NG, a highly efficient, modular, Perl-based web crawler.
  • 13
    We are integrating existing communication systems including Wiki, IRC, Instant Messaging, e-mail, and even static web sites. We write web scrapers and servers for managing events, IRC bots, logs, local names, templates, and groups.
  • 14
    FWebSpider is a web crawler application written in Perl. It performs crawls of chosen sites, featuring a response cache, URL storage, URL exclusion rules, and more. It is developed to function as the core of a local/global site search engine.
  • 15
    Webhunter is a distributed, multi-threaded web crawler designed for both general indexing and crawling the web for focused content.
  • 16
    ApeSmit is a very simple Python module to create XML sitemaps as defined at http://www.sitemaps.org. ApeSmit doesn't contain a web spider or anything like that; it just writes the data you provide to a file using the proper syntax (see the sitemap XML sketch after this list).
  • 17
    arachne is a C++ library for HTTP crawling, link, text and metadata extraction designed to run in a distributed environment.
  • 18
    The purpose of SAWS is to facilitate web scraping by 1) providing a pattern-specification mechanism on top of normal regular expressions and 2) implementing a common matching algorithm to run a specified pattern against a given source for any matches (see the regex scraping sketch after this list).
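
The robots.txt handling that item 1 (J-Obey) packages amounts to reading User-agent and Disallow rules and checking URL paths against them. Below is a minimal Java sketch of that idea; it is a hypothetical illustration with invented names, not the J-Obey API.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal illustration of what a robots.txt parser/interpreter does.
// Hypothetical sketch; class and method names are invented, not J-Obey's.
public class RobotsTxtSketch {
    private final List<String> disallowed = new ArrayList<>();

    // Collect the "Disallow:" rules that apply to every crawler ("User-agent: *").
    public void parse(String robotsTxt) {
        boolean appliesToUs = false;
        for (String line : robotsTxt.split("\n")) {
            String trimmed = line.trim();
            if (trimmed.toLowerCase().startsWith("user-agent:")) {
                appliesToUs = trimmed.substring(11).trim().equals("*");
            } else if (appliesToUs && trimmed.toLowerCase().startsWith("disallow:")) {
                String path = trimmed.substring(9).trim();
                if (!path.isEmpty()) {
                    disallowed.add(path);
                }
            }
        }
    }

    // A URL path is fetchable unless it starts with a disallowed prefix.
    public boolean isAllowed(String path) {
        for (String prefix : disallowed) {
            if (path.startsWith(prefix)) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        RobotsTxtSketch robots = new RobotsTxtSketch();
        robots.parse("User-agent: *\nDisallow: /private/\nDisallow: /tmp/\n");
        System.out.println(robots.isAllowed("/index.html")); // true
        System.out.println(robots.isAllowed("/private/a"));  // false
    }
}
```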
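Item 10 (Arachnid) describes the common subclass-and-callback spider pattern: the framework fetches each page and hands it to a method you override. The sketch below illustrates that pattern with assumed class and method names; Arachnid's actual API may differ.

```java
import java.net.URL;

// Hypothetical sketch of the subclass-and-callback spider pattern;
// the names here are assumptions, not Arachnid's real classes or methods.
abstract class SpiderBase {
    // Called by the framework after each page has been fetched and parsed.
    protected abstract void handlePage(URL url, String html);

    public void crawl(URL start) {
        // A real framework would fetch the page, parse its HTML,
        // invoke handlePage(), and queue any links it found.
        handlePage(start, "<html>...</html>");
    }
}

public class LoggingSpider extends SpiderBase {
    @Override
    protected void handlePage(URL url, String html) {
        // The "few lines of code called after each page": just log the URL here.
        System.out.println("Visited: " + url);
    }

    public static void main(String[] args) throws Exception {
        new LoggingSpider().crawl(new URL("http://example.com/"));
    }
}
```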
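Item 16 (ApeSmit) emits sitemap XML in the format defined at http://www.sitemaps.org. The sketch below builds the same kind of urlset document directly in Java, without using ApeSmit, just to show what such output looks like.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Builds a sitemaps.org-style XML document from a map of URL -> priority.
// Illustration of the format only; this does not use the ApeSmit module.
public class SitemapSketch {
    public static String build(Map<String, String> urlToPriority) {
        StringBuilder xml = new StringBuilder();
        xml.append("<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n");
        xml.append("<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n");
        for (Map.Entry<String, String> e : urlToPriority.entrySet()) {
            xml.append("  <url>\n");
            xml.append("    <loc>").append(e.getKey()).append("</loc>\n");
            xml.append("    <priority>").append(e.getValue()).append("</priority>\n");
            xml.append("  </url>\n");
        }
        xml.append("</urlset>\n");
        return xml.toString();
    }

    public static void main(String[] args) {
        Map<String, String> urls = new LinkedHashMap<>();
        urls.put("http://example.com/", "1.0");
        urls.put("http://example.com/about", "0.5");
        System.out.print(build(urls));
    }
}
```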
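Item 18 (SAWS) layers a pattern-specification mechanism on top of ordinary regular expressions. As a baseline for comparison, the sketch below shows plain regex-based scraping in Java; it is an assumed example and does not use the SAWS pattern syntax.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Plain regular-expression scraping: extract href targets from an HTML snippet.
// Assumed example for comparison; not the SAWS pattern-specification syntax.
public class ScrapeSketch {
    public static void main(String[] args) {
        String source = "<a href=\"/a.html\">A</a> <a href=\"/b.html\">B</a>";
        Pattern linkPattern = Pattern.compile("href=\"([^\"]+)\"");
        Matcher m = linkPattern.matcher(source);
        while (m.find()) {
            System.out.println(m.group(1)); // prints /a.html then /b.html
        }
    }
}
```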