Our members are constantly sharing generative AI tools and resources with the community. Feel free to join if you would like to learn and collaborate with them. Otherwise, keep this page bookmarked, as we’ll frequently update it.
LLM Frameworks & Tools
- LangChain – a framework for developing applications powered by language models
- OpenAI Cookbook – a comprehensive guide to using the OpenAI API.
- Includes example code and guides for a variety of tasks, such as generating text, translating languages, writing different kinds of creative content, and answering your questions in an informative way.
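As a taste of what the Cookbook covers, here is a minimal sketch of calling the Chat Completions endpoint using only the Python standard library. The endpoint URL is the public one; the model name is illustrative, and the request is only built here (sending it requires a valid `OPENAI_API_KEY`).

```python
import json
import os
import urllib.request

API_URL = "https://api.openai.com/v1/chat/completions"

def build_chat_request(prompt, model="gpt-4o-mini"):
    """Build an HTTP request for the OpenAI Chat Completions endpoint."""
    payload = {
        "model": model,  # illustrative model name
        "messages": [{"role": "user", "content": prompt}],
    }
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {os.environ.get('OPENAI_API_KEY', '')}",
    }
    return urllib.request.Request(
        API_URL, data=json.dumps(payload).encode(), headers=headers
    )

# To actually send it (needs a real API key):
# with urllib.request.urlopen(build_chat_request("Say hello")) as resp:
#     print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

The official `openai` Python package wraps all of this for you; the raw request is shown only to make the API shape visible.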
- spaCy (Repo) – an open-source library for advanced Natural Language Processing (NLP) in Python, designed for production use.
- It enables developers to build applications that can process and understand large volumes of text for various purposes, such as information extraction, natural language understanding, or text preprocessing for deep learning. spaCy offers functionalities like tokenization, part-of-speech tagging, dependency parsing, lemmatization, named entity recognition, entity linking, similarity, text classification, rule-based matching, training, and serialization.
- More detail
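For a first feel of spaCy, tokenization works even with a blank pipeline (no model download needed; tagging, parsing, and NER require a trained pipeline such as `en_core_web_sm`):

```python
import spacy

# A blank English pipeline includes the tokenizer but no trained components.
nlp = spacy.blank("en")
doc = nlp("spaCy splits text into tokens, handling punctuation and symbols.")
tokens = [token.text for token in doc]
print(tokens)  # punctuation like "," and "." become their own tokens
```

Swapping `spacy.blank("en")` for `spacy.load("en_core_web_sm")` unlocks the tagging, parsing, and entity features listed above.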
- FastAPI (Repo) – a modern, fast (high-performance) web framework for building APIs with Python 3.8+ based on standard Python type hints.
- LangSmith (Repo) – a platform for building production-grade LLM applications.
- It lets you debug, test, evaluate, and monitor chains and intelligent agents built on any LLM framework and seamlessly integrates with LangChain, the go-to open source framework for building with LLMs.
- Intro article
- AutoLLM – combines the strengths of LangChain, LlamaIndex, and LiteLLM in a single tool.
- MemGPT – manages memory tiers in LLMs, apparently built on clever RAG techniques. Should be helpful for perpetual conversations.
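The memory-tier idea can be sketched in plain Python. This is a toy illustration of the concept only, not MemGPT's actual design: a bounded working context with overflow evicted to an archive, recalled here by keyword search where MemGPT would use an LLM and retrieval.

```python
from collections import deque

class TieredMemory:
    """Toy two-tier memory: a small working context plus an archive."""

    def __init__(self, context_size=3):
        self.context = deque()
        self.context_size = context_size
        self.archive = []

    def add(self, message):
        self.context.append(message)
        # Evict oldest messages from the working context into the archive.
        while len(self.context) > self.context_size:
            self.archive.append(self.context.popleft())

    def recall(self, keyword):
        # Stand-in for semantic retrieval over archived messages.
        return [m for m in self.archive if keyword.lower() in m.lower()]

mem = TieredMemory(context_size=2)
for msg in ["my name is Ada", "I like Rust", "what's the weather?"]:
    mem.add(msg)
print(list(mem.context))   # only the most recent messages stay in context
print(mem.recall("name"))  # older facts are still reachable via the archive
```

The point is that a conversation can outlive the context window as long as evicted facts remain searchable.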
- llamafile lets you distribute and run LLMs with a single file, and the feedback on Hacker News has been fantastic.
- LLMLingua is like a zip file for your prompt inputs and outputs, compressing any request down to the minimum token size while retaining the same semantics.
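To make the idea concrete, here is a toy sketch of prompt compression: drop low-information filler words while keeping content words. LLMLingua itself uses a small language model to score token importance; this stopword filter only illustrates the concept.

```python
# Hypothetical filler-word list for illustration only.
STOPWORDS = {"a", "an", "the", "please", "kindly", "really", "very", "just"}

def compress(prompt: str) -> str:
    """Naively shrink a prompt by removing filler words."""
    kept = [w for w in prompt.split() if w.lower() not in STOPWORDS]
    return " ".join(kept)

original = "Please just summarize the following article in a very concise way"
print(compress(original))  # fewer tokens, same core request
```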
- C-Nedelcu – talk to ChatGPT using your voice and hear its answers read back aloud
- Webpilot – lets users provide one or more URLs and make requests to interact with, extract specific information from, or modify the content of those pages.
- LangUI – an open-source Tailwind library with free-to-use components tailored for AI and GPT projects. Focus on building your next project and let it handle the UI.
Generative AI Applications
- Firmograph – Chrome’s #1 company research copilot. Firmograph empowers your sales force by ensuring no opportunity is overlooked. It provides real-time updates and actionable insights on compelling events like leadership changes, funding rounds, and critical company news, keeping you ahead of the competition.
- Dust – Secure AI assistant with your company’s knowledge
- ChainForge (Repo) – Open-source visual programming environment for prompt engineering.
- With ChainForge, you can evaluate the robustness of prompts and text generation models in a way that goes beyond anecdotal evidence.
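The kind of comparison ChainForge makes visual can be sketched as a loop: run each prompt variant over a set of test cases and tabulate pass rates. `fake_model` below is a stand-in for a real LLM call; the templates and cases are made up for illustration.

```python
def fake_model(prompt: str) -> str:
    # Stand-in: a real evaluation would call an LLM API here.
    return "4" if "2 + 2" in prompt else "unsure"

variants = [
    "What is {q}?",
    "Answer briefly: {q}",
]
cases = [("2 + 2", "4"), ("3 + 5", "8")]

scores = {}
for template in variants:
    hits = sum(
        fake_model(template.format(q=q)) == expected for q, expected in cases
    )
    scores[template] = hits / len(cases)

print(scores)  # pass rate per prompt variant, not a single anecdote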
- InfraNodus (Repo) – Generate insights with AI and knowledge graphs
- FlowiseAI (Repo) – an open-source visual UI tool for building customized LLM flows using LangchainJS, written in Node
- OneAI – an NLP-as-a-service platform. Their APIs let developers analyze, process, and transform language input in their code. No training data or NLP/ML knowledge is required.
- Godmode AI – a web-based UI for a multi-agent implementation
- Onboard AI – Quickly understand and navigate a new codebase with AI.
Build Your Own LLMs & Applications
- Member Daniel Daugherty built a homegrown LLM on an M1 MacBook Pro.
- A software engineer at LangChain built a homemade LLM-powered app hosted locally on a 16 GB M2 MacBook Pro
- Talk to your documents, including PDFs, txt files, and even web pages
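The retrieval step behind "talk to your documents" can be sketched minimally: score each document chunk against the question and hand the best one to an LLM as context. Real apps use embeddings and a vector store; the word-overlap score and sample chunks below are only illustrative.

```python
import re

def tokenize(text: str) -> set:
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def best_chunk(question: str, chunks: list) -> str:
    """Pick the chunk sharing the most words with the question."""
    q_words = tokenize(question)
    return max(chunks, key=lambda c: len(q_words & tokenize(c)))

chunks = [
    "Invoices are due within 30 days of receipt.",
    "The warranty covers manufacturing defects for two years.",
]
context = best_chunk("How long is the warranty?", chunks)
print(context)  # the chunk an LLM would be asked to answer from
```

In a real pipeline the selected chunk is stuffed into the prompt alongside the question, which is what lets a small local model answer from documents it was never trained on.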
- Apple released AIM on Hugging Face. The image models are inspired by LLMs and exhibit similar scaling properties.
- According to their white paper, even at a scale of 7 billion parameters trained on 2 billion images, they saw no signs of performance saturation.
