
Farfalle

AI search engine - self-host with local or cloud LLMs

Author: rashadphz
GitHub Stars: 3309
Last Commit: Sep 27, 2024
Created: Aug 27, 2024

Overview:

Farfalle is an open-source AI-powered search engine that can run with local models or cloud models. It supports multiple search providers and lets users pose questions to either cloud or local LLMs, enhancing the search experience.

Features:

  • Multiple Search Providers: Utilize search capabilities from different providers like Tavily, SearXNG, Serper, and Bing.
  • Answer Questions with Models: Answer questions with cloud models like OpenAI/gpt-4o, OpenAI/gpt-3.5-turbo, and Groq/Llama3, or local models like llama3, mistral, gemma, and phi3.
  • Custom LLM Support: Users can leverage custom large language models (LLMs) through LiteLLM for tailored search experiences.
  • Agent-assisted Search: Conduct searches with an agent that plans and executes the search for more accurate results.
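As a rough sketch, a configuration mixing a search provider, a cloud model, and a LiteLLM custom model might look like the following. The variable names here are illustrative assumptions, not the project's documented settings; the "provider/model" string format, however, is LiteLLM's convention:

```shell
# Illustrative .env sketch — variable names are assumptions,
# not the project's documented settings.
TAVILY_API_KEY=tvly-...       # search provider key
OPENAI_API_KEY=sk-...         # cloud LLM key
# LiteLLM-style model string ("provider/model"), e.g. a local Ollama model:
CUSTOM_MODEL=ollama/llama3
```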

Installation:

Getting Started Locally:

  1. Prerequisites:

    • Docker
    • Ollama (if using local models)
    • Download any of the supported local models: llama3, mistral, gemma, or phi3
    • Start the ollama server: ollama serve
    • Get API keys for optional providers like Tavily, Serper, OpenAI, Bing, Groq
  2. Quick Start:

    • Add your API keys to the .env file (optional when running only local models via Ollama)
    • Start the app and visit http://localhost:3000
    • For custom setup, refer to the custom-setup-instructions.md file.
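The steps above can be sketched as a short command sequence. This is a hedged walkthrough, not the project's exact instructions: the repository URL, the `.env` template file name, and the compose invocation are assumptions and may differ from the actual repo.

```shell
# Hedged local start-up sketch; assumes Docker and Ollama are installed.
ollama pull llama3        # download a supported local model
ollama serve &            # start the Ollama server in the background

# Repository URL and file names below are assumptions.
git clone https://github.com/rashadphz/farfalle.git
cd farfalle
cp .env-template .env     # edit in your API keys (optional for Ollama)
docker compose up -d      # then visit http://localhost:3000
```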

Deploy:

  1. Backend:

    • Deploy the backend and copy its public URL for the frontend configuration
  2. Frontend:

    • Set NEXT_PUBLIC_API_URL to the copied backend URL
    • Deploy the frontend with Vercel
  3. Using Farfalle as a Search Engine:

    • Create a new search engine entry in your browser settings using this URL: http://localhost:3000/?q=%s
    • Set the new entry as your default search engine
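The %s placeholder is what the browser replaces with your URL-encoded query. A minimal sketch of the URL that results, assuming the default local deployment on port 3000 (the encoding here handles spaces only, for illustration):

```shell
# Build the search URL a browser would produce for a query.
query="self hosted ai search"
encoded=$(printf '%s' "$query" | sed 's/ /%20/g')   # minimal encoding: spaces only
echo "http://localhost:3000/?q=${encoded}"
# → http://localhost:3000/?q=self%20hosted%20ai%20search
```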

Summary:

Farfalle is a versatile search engine that combines the power of AI models with multiple search providers. Users can query cloud or local models, integrate custom LLMs through LiteLLM, and rely on agent-assisted searches for accurate, efficient results.