WPCleaner is a small, portable tool designed to help with maintenance tasks (mainly disambiguation and checks) on Wikipedia.
The source code has been moved to GitHub: https://github.com/WPCleaner/wpcleaner
This extension is now hosted on GitHub: https://github.com/garbear/facebook-mediawiki
Facebook for MediaWiki is an extension to the MediaWiki software for integrating the Facebook Connect experience into public and personal wikis. Please see http://www.mediawiki.org/wiki/Extension:Facebook
A Wikipedia bot (a computer program designed to make automated edits to Wikipedia), written in standard C (C99) using the standard libraries and libcurl.
This project has not met its long-term goals and has been abandoned.
Adds the ability to navigate within and between pages to the standard Wikipedia interface. The project won Best-in-Contest in the 2010 AVIOS Speech Application contest.
A stand-alone editor that uses the MediaWiki markup language to generate HTML. You can create and preview pages written in MediaWiki markup (i.e., Wikipedia pages) while offline.
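The core of such an editor is a wikitext-to-HTML conversion step. As a minimal sketch (not the project's actual code), the function below handles only bold, italic, and internal links; a real converter covers far more of the markup.

```python
import re


def wikitext_to_html(text: str) -> str:
    """Convert a small subset of MediaWiki markup to HTML.

    Handles '''bold''', ''italic'', and [[internal links]] only.
    """
    # Bold must be matched before italic, since ''' contains ''.
    text = re.sub(r"'''(.+?)'''", r"<b>\1</b>", text)
    text = re.sub(r"''(.+?)''", r"<i>\1</i>", text)

    # [[Target|label]] or [[Target]]
    def link(m: re.Match) -> str:
        target, _, label = m.group(1).partition("|")
        return f'<a href="/wiki/{target.replace(" ", "_")}">{label or target}</a>'

    return re.sub(r"\[\[(.+?)\]\]", link, text)


print(wikitext_to_html("'''Hello''' ''world'', see [[Main Page|home]]."))
```

A preview pane would simply re-run this conversion on each edit and render the resulting HTML.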
Automatically embeds Wikipedia topic information into PDF documents via pop-up annotations. It relies on the Wikipedia Miner service, which is also available on SourceForge.
A Python library and collection of tools that automate work on MediaWiki sites. Originally designed for Wikipedia, it is now used throughout the Wikimedia Foundation's projects and on many other MediaWiki wikis, including the Wikidata service.
We no longer use sourceforge.net, but are very grateful for their past support. Please see our website at https://www.mediawiki.org/wiki/Pywikibot
Our PyPI package can be found at https://pypi.org/project/pywikibot/
WagTools is an open-source Firefox extension for Wikipedia editors. It provides functionality similar to other Wikipedia tools such as Huggle and AWB.
This program allows Wikipedia users to tag each article. The resulting tag cloud is displayed as a visual map, so users can see the relationships between articles with similar content even if they belong to different categories.
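One simple way to relate tagged articles across categories, sketched here with hypothetical tag data rather than the project's own algorithm, is Jaccard similarity over each pair's tag sets:

```python
def jaccard(tags_a: set, tags_b: set) -> float:
    """Overlap of two tag sets: |A ∩ B| / |A ∪ B|."""
    union = tags_a | tags_b
    return len(tags_a & tags_b) / len(union) if union else 0.0


# Hypothetical user-applied tags; the articles sit in different categories.
articles = {
    "Apollo 11":      {"space", "nasa", "history"},
    "Saturn V":       {"space", "nasa", "rocket"},
    "French cuisine": {"food", "culture"},
}

# Rank article pairs by tag overlap, most similar first.
pairs = sorted(
    ((a, b, jaccard(articles[a], articles[b]))
     for a in articles for b in articles if a < b),
    key=lambda p: -p[2],
)
for a, b, score in pairs:
    print(f"{a} <-> {b}: {score:.2f}")
```

A visual map could then draw edges between articles whose similarity exceeds some threshold.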
MWQL is a MediaWiki extension that provides end users with a language for structural queries, so that they can build dynamic pages like those seen in Wikipedia's Special pages.
A collection of Python scripts to create and handle an XML corpus (a large collection of text for linguistic purposes) from an original Wikipedia database backup dump. It includes a regular-expression-based parser for the MediaWiki markup language.
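The regex-parsing step can be illustrated with a deliberately small pipeline, a sketch rather than the project's actual parser, that strips common markup so only plain text remains for the corpus:

```python
import re


def strip_wikitext(source: str) -> str:
    """Reduce raw wiki markup to plain text for corpus building.

    A minimal regex pipeline; real dump processing also needs
    nested templates, tables, and reference handling.
    """
    source = re.sub(r"\{\{[^{}]*\}\}", "", source)                     # drop simple {{templates}}
    source = re.sub(r"\[\[(?:[^|\]]*\|)?([^\]]+)\]\]", r"\1", source)  # keep link labels
    source = re.sub(r"'{2,}", "", source)                              # drop bold/italic quotes
    source = re.sub(r"==+\s*(.*?)\s*==+", r"\1", source)               # unwrap == headings ==
    return re.sub(r"[ \t]+", " ", source).strip()


print(strip_wikitext(
    "== History ==\n'''Wikipedia''' is a [[wiki|collaborative]] site.{{citation needed}}"
))
```

Running each cleaned page through a function like this, then wrapping the results in XML elements, yields the kind of corpus the scripts produce.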