Showing 2 open source projects for "wikipedia"

  • 1

    irit_diff_sequences

    Python tool to create lifespan sequences from the Wikipedia edit history

    A Python tool that produces lifespan sequences from an edit history. The tool was first developed for the Wikipedia edit history but can easily be adapted to other applications. From a database containing, for each article, its list of revisions, it produces one CSV file per article containing authored sequences and their lifespans. Output format: i,j,lifespan,author, where i is the beginning of the character sequence, j is the end of the character sequence, lifespan is the number of edits the sequence has survived up to the latest revision, and author is the author id of the sequence. A minimal parsing sketch appears after the project list.
    Downloads: 0 This Week
  • 2

    wiki export tool

    A tool for exporting Wikipedia data

    A simple tool, created within the French project "CoMeRe", for exporting data from any Wikimedia Foundation project, including Wikipedia. To use it, you need to prepare two things: the target wiki dump and the target page name. Then configure the tool with the dump path, enter the page name, and let it run; you will get the target page. The duration depends on the size of the dump and the number of pages. Be careful with the "talk, other talk..." option: it currently only works for French corpora. A minimal extraction sketch appears after the project list.
    Downloads: 0 This Week
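
The output format described for irit_diff_sequences (one CSV file per article, rows of i,j,lifespan,author) can be read with a few lines of standard-library Python. The sketch below is only a possible consumer, not part of the project: it assumes the files have no header row, that i, j, and lifespan are integers, and the file name "article_12345.csv" is hypothetical.

    import csv
    from collections import namedtuple

    Sequence = namedtuple("Sequence", ["i", "j", "lifespan", "author"])

    def read_lifespan_sequences(path):
        """Return the authored character sequences recorded for one article."""
        sequences = []
        with open(path, newline="") as f:
            for row in csv.reader(f):
                if len(row) != 4:
                    continue  # skip malformed lines
                i, j, lifespan, author = row
                sequences.append(Sequence(int(i), int(j), int(lifespan), author))
        return sequences

    if __name__ == "__main__":
        seqs = read_lifespan_sequences("article_12345.csv")
        if seqs:
            # average number of edits a sequence survived up to the latest revision
            print(sum(s.lifespan for s in seqs) / len(seqs))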
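
The wiki export tool's description (find a target page inside a wiki dump by page name) follows the same pattern as any dump extractor. The sketch below is not the tool's own code; it streams a decompressed MediaWiki XML dump with the standard library and returns the wikitext of the first page whose title matches. The dump file name, the page title, and the export namespace version are placeholders that vary between dumps.

    import xml.etree.ElementTree as ET

    def extract_page(dump_path, page_title):
        """Stream the dump and return the wikitext of the first page with a matching title."""
        for _, elem in ET.iterparse(dump_path, events=("end",)):
            if not elem.tag.endswith("}page"):
                continue
            # MediaWiki export namespace, e.g. "{http://www.mediawiki.org/xml/export-0.10/}"
            ns = elem.tag[: elem.tag.index("}") + 1]
            if elem.findtext(ns + "title") == page_title:
                return elem.findtext(ns + "revision/" + ns + "text")
            elem.clear()  # free memory; dumps can be very large
        return None

    if __name__ == "__main__":
        text = extract_page("frwiki-latest-pages-articles.xml", "Corpus")
        print(text[:500] if text else "page not found")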