Reor Project

A self-organizing AI note-taking app that runs models locally.


New: We are now on Discord! Hop by to give ❤️feedback❤️ or discuss our upcoming features.

About

Reor is an AI-powered desktop note-taking app: it automatically links related notes, answers questions about your notes, and provides semantic search. Everything is stored locally, and you can edit your notes with an Obsidian-like markdown editor. The hypothesis of the project is that AI tools for thought should run models locally by default. Reor stands on the shoulders of the giants Ollama, Transformers.js & LanceDB to enable both LLMs and embedding models to run locally. Connecting to OpenAI or OpenAI-compatible APIs like Oobabooga is also supported.

How can it possibly be "self-organizing"?

  1. Every note you write is chunked and embedded into an internal vector database.
  2. Related notes are connected automatically via vector similarity.
  3. LLM-powered Q&A does retrieval-augmented generation (RAG) over the corpus of notes.
  4. Everything can be searched semantically.

One way to think about Reor is as a RAG app with two generators: the LLM and the human. In Q&A mode, the LLM is fed retrieved context from the corpus to help answer a query. Similarly, in editor mode, the human can toggle the sidebar to reveal related notes "retrieved" from the corpus. This is quite a powerful way of "augmenting" your thoughts by cross-referencing ideas in a current note against related ideas from your corpus.
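
To make the linking step concrete, here is a minimal sketch: chunk text, embed it locally, and rank other notes by vector similarity. It uses Transformers.js for embeddings (as Reor does) but a plain in-memory cosine comparison in place of LanceDB, and the model and note strings are just examples; this is an illustration of the idea, not Reor's actual code.

import { pipeline } from '@xenova/transformers';

// Load a small local embedding model (example model; Reor's choice may differ).
const extractor = await pipeline('feature-extraction', 'Xenova/all-MiniLM-L6-v2');

// Embed a chunk of text into a normalized vector.
async function embed(text: string): Promise<number[]> {
  const output = await extractor(text, { pooling: 'mean', normalize: true });
  return Array.from(output.data as Float32Array);
}

// For normalized vectors, cosine similarity is just the dot product.
const similarity = (a: number[], b: number[]) =>
  a.reduce((sum, v, i) => sum + v * b[i], 0);

// Pretend each string is a chunk taken from a different note in the vault.
const chunks = [
  'Spaced repetition schedules reviews just before you forget.',
  'LanceDB is an embedded vector database for fast similarity search.',
  'Flashcards work best with short, atomic questions.',
];
const vectors = await Promise.all(chunks.map(embed));

// "Related notes" for the note currently being edited: rank chunks by similarity.
const current = await embed('How should I schedule my flashcard reviews?');
const related = chunks
  .map((chunk, i) => ({ chunk, score: similarity(current, vectors[i]) }))
  .sort((a, b) => b.score - a.score);

console.log(related);

In Reor itself the vectors live in a LanceDB table on disk, so the same kind of similarity query scales to a whole directory of notes.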

Demo video: https://github.com/reorproject/reor/assets/17236551/94a1dfeb-3361-45cd-8ebc-5cfed81ed9cb

Getting Started

  1. Download from reorproject.org or releases. Mac, Linux & Windows are all supported.
  2. Install it like a normal app.

Running local models

Reor interacts directly with Ollama, which means you can download and run models locally right from inside Reor. Head to Settings->Add New Local LLM, then enter the name of the model you want Reor to download. You can find available models here.
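
For the curious, a model pull against a locally running Ollama server looks roughly like the sketch below. This only illustrates Ollama's public REST API (default port 11434, example model name); it is not how Reor's Settings screen is implemented.

// Ask the local Ollama server to download a model (example name: 'llama3').
const pull = await fetch('http://localhost:11434/api/pull', {
  method: 'POST',
  body: JSON.stringify({ model: 'llama3', stream: false }),
});
console.log(await pull.json()); // e.g. { status: 'success' } once the download finishes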

You can also connect to an OpenAI-compatible API like Oobabooga, Ollama or OpenAI itself!
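
"OpenAI-compatible" means the same client code works against any of these backends; only the base URL, API key, and model name change. A hedged sketch using the official openai client, pointed at Ollama's local endpoint as an example:

import OpenAI from 'openai';

// Point the standard OpenAI client at a local OpenAI-compatible server.
// The base URL below is Ollama's default endpoint; Oobabooga or OpenAI itself
// would just use a different URL and API key.
const client = new OpenAI({
  baseURL: 'http://localhost:11434/v1',
  apiKey: 'unused-for-local-servers',
});

const completion = await client.chat.completions.create({
  model: 'llama3', // example model name
  messages: [{ role: 'user', content: 'Summarize my note on spaced repetition.' }],
});

console.log(completion.choices[0].message.content);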

Importing notes from other apps

Reor works within a single directory in the filesystem. You choose the directory on first boot.
To import notes/files from another app, you'll need to populate that directory manually with markdown files. Note that frontmatter in your markdown files may not be parsed correctly. Integrations with other apps are hopefully coming soon!

Building from source

Make sure you have Node.js installed.

Clone repo:

git clone https://github.com/reorproject/reor.git

Install dependencies:

npm install

Run for dev:

npm run dev

Build:

npm run build

Contributions

Contributions are welcome in all areas: features, ideas, bug fixes, design, etc. This is very much a community-driven project. There are some open issues to choose from. For new features, please open an issue to discuss it before beginning work on a PR :)

Folder Structure

The main components of the project are located in the following directories:

  • /electron: Contains the main process functions that manage all the filesystem interactions, LLMs, embedding models, and the vector database.
  • /src: Contains the frontend of the application, which is a React app.

License

GPL-3.0 license. See LICENSE for details.

Reor means "to think" in Latin.