In this practical, we will explore how to work with large language models, both through hosted APIs and with models running locally. You will need the libraries listed below.
To install them, you can run:
pip install gpt4all chromadb langchainhub
Note that the imports below also rely on pandas, numpy, langchain, huggingface_hub, scikit-learn, and pyLDAvis; see the requirements at the end of this section for the full list.
Make sure that the following imports run without errors.
# Library to use LLMs
from gpt4all import GPT4All
# Usual data representation and manipulation libraries
import pandas as pd
import numpy as np
# LangChain components
from langchain.document_loaders import WebBaseLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.vectorstores import Chroma
from langchain.embeddings import GPT4AllEmbeddings
from langchain.llms import GPT4All as langChainGPT4All
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain
# HuggingFace Inference API
from huggingface_hub import InferenceClient
# Standard utilities (file paths and notebook display)
from pathlib import Path
import IPython.display as ipd
# Libraries for topic modelling
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF
import pyLDAvis
import pyLDAvis.lda_model
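Once the imports succeed, a quick way to verify that GPT4All works end to end is to load a small local model and generate a few tokens. This is a minimal sketch: the model name orca-mini-3b-gguf2-q4_0.gguf is only an illustrative choice (GPT4All downloads the weights on first use, roughly 2 GB), and any model from the GPT4All catalogue can be substituted.
# Optional sanity check: load a small local model and generate a short reply.
# Assumption: "orca-mini-3b-gguf2-q4_0.gguf" is an example model name; GPT4All
# downloads the file on first use (~2 GB), so this may take a few minutes.
model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")
with model.chat_session():
    print(model.generate("Name three uses of a language model.", max_tokens=100))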
The full requirements are listed below: