
#99 No module named 'llama_index.response'

Status: open
Owner: nobody
Labels: None
Updated: 2024-04-18
Created: 2024-02-22
Creator: Anonymous
Private: No

Originally created by: laurenceanthony

After installing everything from the requirements.txt file, I'm running into the following problem:
No module named 'llama_index.response'

Is the code set up for an old version of llama_index?
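
For context, this error typically means code written against an older llama-index release, which exposed a top-level llama_index.response module, is running against a newer install where that module no longer exists. The following is a minimal way to reproduce the failure and report the installed version; the specific import path is an assumption about what pyqt-openai uses, not taken from its source:

    # Reproduce the ModuleNotFoundError and report the installed llama-index version.
    import importlib.metadata

    try:
        # Old-style import path from pre-0.10 llama-index; on current releases
        # the top-level llama_index.response module no longer exists.
        from llama_index.response.schema import Response
        print("Old-style import still works:", Response)
    except ModuleNotFoundError as exc:
        installed = importlib.metadata.version("llama-index")
        print(f"{exc} (installed llama-index version: {installed})")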

Discussion

  • Anonymous

    Anonymous - 2024-02-23

    Originally posted by: yjg30737

    As a temporary measure, I've removed the llama-index package from pyqt-openai, since it is what causes the error; as you said, the code was written for an old version, and llama-index usage has changed significantly.

    I will create a tag to track how to implement the "new" llama-index in this package.

     
  • Anonymous

    Anonymous - 2024-02-23

    Originally posted by: laurenceanthony

    Sounds good. Once I get a working version here, I'll see if I can contribute to it.

     
  • Anonymous

    Anonymous - 2024-04-18

    Originally posted by: yjg30737

    It is a bit too late (which is an understatement), but I applied the new llama-index code to the feature/llamaindex branch just a couple of days ago. If there are no issues, I will merge it this weekend.

     

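For reference, the change the comments above describe is the llama-index 0.10 restructure, which moved the core APIs under the llama_index.core namespace. Below is a minimal sketch of the old versus new import style, assuming llama-index >= 0.10 and an OpenAI API key in the environment for the default models; it illustrates the package change only and is not how pyqt-openai or the feature/llamaindex branch actually wires things up:

    # Old style (llama-index < 0.10): flat top-level package, e.g.
    #   from llama_index import VectorStoreIndex, SimpleDirectoryReader
    #   from llama_index.response.schema import Response
    #
    # New style (llama-index >= 0.10): core APIs live under llama_index.core.
    from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

    # Build an index over local text files and run a query; the defaults use
    # OpenAI for both the LLM and embeddings, so OPENAI_API_KEY must be set.
    documents = SimpleDirectoryReader("docs").load_data()
    index = VectorStoreIndex.from_documents(documents)
    response = index.as_query_engine().query("What does this document collection cover?")
    print(response)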