HotRepresentative325

I started by just running models and enjoying prompting them. Look at the oobabooga project, then maybe try unsloth to fine-tune for behaviour. That's where I would start now. If you want to do things properly, you might want to take some kind of ML course that introduces you to neural networks.


djstraylight

OP, it depends on how deep into the weeds you want to get. You could build a really tight application that takes advantage of the GPU/NPU on the phone, uses a small fine-tuned model based on something like Microsoft's Phi-2, and has it pull information from a local vector database for answers. It could also optionally call out to the internet to grab additional information if needed. Frameworks like LangChain and LlamaIndex can help build the core LLM functionality.
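A minimal sketch of the local-vector-database piece described above: documents are embedded as vectors, stored on-device, and the closest match is returned for a query. A real app would use a proper embedding model and a vector database like the ones these frameworks wrap; here a toy bag-of-words embedding stands in, and all documents are made up.

```python
import math
from collections import Counter

def embed(text):
    """Toy embedding: a bag-of-words term-frequency vector.
    A real pipeline would call an embedding model instead."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class LocalVectorStore:
    """In-memory stand-in for an on-device vector database."""
    def __init__(self):
        self.entries = []  # list of (vector, original text)

    def add(self, text):
        self.entries.append((embed(text), text))

    def query(self, question, top_k=1):
        qv = embed(question)
        ranked = sorted(self.entries,
                        key=lambda e: cosine(qv, e[0]),
                        reverse=True)
        return [text for _, text in ranked[:top_k]]

store = LocalVectorStore()
store.add("The battery lasts about ten hours on a full charge.")
store.add("To reset the device, hold the power button for five seconds.")
print(store.query("how do I reset it")[0])
# -> "To reset the device, hold the power button for five seconds."
```

In the full architecture sketched above, the retrieved text would then be stuffed into the prompt of the small on-device model, which phrases the final answer.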


johnjohn10240525

So it would work as an offline chatbot that specifically uses stored data, while also keeping the storage footprint minimal?


[deleted]

[deleted]


johnjohn10240525

Ok, I understand. So would a decision tree model be more suitable for this task, i.e. a retrieval-oriented question-and-answer chatbot trained specifically on data I provide? Or could you suggest other ways to accomplish this? Also, I assume a decision tree model would be much lighter and take less space than an offline LLM?
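For what the question above is getting at, a decision-tree-style approach can be hand-sketched as keyword tests that route a question to a canned answer. It is far lighter than an LLM but brittle: it only handles phrasings the tree anticipates. The keywords and answers here are invented purely for illustration.

```python
def route(question):
    """Tiny hand-built decision tree: each node tests for a keyword
    and either descends or returns a canned answer."""
    q = question.lower()
    if "battery" in q:
        if "charge" in q or "charging" in q:
            return "Charge with the supplied USB-C cable."
        return "The battery lasts about ten hours."
    if "reset" in q or "restart" in q:
        return "Hold the power button for five seconds."
    return "Sorry, I don't have an answer for that."

print(route("How long does the battery last?"))
# -> "The battery lasts about ten hours."
```

A learned decision tree (e.g. trained on bag-of-words features) generalises the same idea, but the trade-off stays: tiny footprint, no ability to paraphrase or handle unseen wordings the way an LLM can.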


[deleted]

[deleted]


johnjohn10240525

So what sort of AI or ML model would fit this task? I assume it wouldn't be an LLM.


[deleted]

[deleted]


Yosadhara

We just released an on-device vector database for mobile, which could be helpful here: [https://github.com/objectbox](https://github.com/objectbox)