# Building Multimodal AI in TypeScript
First, clone the project with the command below:

```bash
git clone https://github.com/weaviate-tutorials/next-multimodal-search-demo
```
The repository lets you do three things: start a Weaviate instance (locally with Docker, or on Weaviate Cloud Services), import your media files, and run the web app.
Note that the first time you run it, Docker will download the ~4.8 GB multi2vec-bind Weaviate module, which contains the ImageBind model.
To start the Weaviate instance, run the following command, which uses the `docker-compose.yml` file:

```bash
docker compose up -d
```
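For reference, a `docker-compose.yml` for this setup looks roughly like the sketch below: one Weaviate container, plus a second container serving ImageBind inference. Image tags, ports, and environment values here are illustrative; the repository's own file is authoritative.

```yaml
---
services:
  weaviate:
    image: cr.weaviate.io/semitechnologies/weaviate:1.24.4
    ports:
      - 8080:8080
    environment:
      AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED: 'true'
      DEFAULT_VECTORIZER_MODULE: multi2vec-bind
      ENABLE_MODULES: multi2vec-bind
      # Weaviate calls out to the sibling container for embeddings
      BIND_INFERENCE_API: http://multi2vec-bind:8080
  multi2vec-bind:
    # this is the ~4.8 GB image that bundles the ImageBind model
    image: cr.weaviate.io/semitechnologies/multi2vec-bind:imagebind
```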
Alternatively, you can create a Weaviate instance on Weaviate Cloud Services as described in this guide.
Then, create a `.env` file and add the following keys:

- `GOOGLE_API_KEY` (you can get this in your Vertex AI settings)
- `WEAVIATE_ADMIN_KEY` (you can get this in your Weaviate dashboard under sandbox details)
- `WEAVIATE_HOST_URL` (you can get this in your Weaviate dashboard under sandbox details)

Before you can import data, add any files to their respective media type folders in the `public/` folder.
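Your `.env` file should end up looking something like this (the values shown are placeholders, not real credentials):

```
GOOGLE_API_KEY=your-vertex-ai-api-key
WEAVIATE_ADMIN_KEY=your-weaviate-admin-key
WEAVIATE_HOST_URL=your-sandbox-host-url
```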
With your data in the right folder, run `yarn install` to install all project dependencies. Then, to initialize a collection and import your data into Weaviate, run:

```bash
yarn run import
```

This may take a minute or two.
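Conceptually, the import step walks the `public/` media folders and turns each file into an object whose blob property holds the base64-encoded file, so the multi2vec-bind module can vectorize it. Below is a minimal sketch of that idea; the helper, property names, and extension map are assumptions for illustration, not the demo's actual import code.

```typescript
import { readFileSync, writeFileSync } from "fs";
import { basename, extname, join } from "path";
import { tmpdir } from "os";

// Hypothetical helper: map a file to the kind of object an import
// script might send to Weaviate. Blob properties are base64 strings.
function toMediaObject(filePath: string): Record<string, string> {
  const ext = extname(filePath).toLowerCase();
  // Route the file to a property by extension (assumed mapping).
  const property =
    [".jpg", ".jpeg", ".png"].includes(ext) ? "image" :
    [".mp3", ".wav"].includes(ext) ? "audio" :
    [".mp4", ".mov"].includes(ext) ? "video" : "text";
  return {
    name: basename(filePath),
    media: property,
    [property]: readFileSync(filePath).toString("base64"),
  };
}

// Tiny demo with a throwaway file:
const sample = join(tmpdir(), "sample.png");
writeFileSync(sample, Buffer.from("not really a png"));
const obj = toMediaObject(sample);
console.log(obj.media); // "image"
```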
Make sure you have your Weaviate instance running with data imported before starting your Next.js Web App.
To run the web app:

```bash
yarn dev
```

...and you can search away!
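Because ImageBind embeds text, images, audio, and video into a single vector space, a plain-text query can retrieve results of any media type. The GraphQL that such a search sends to Weaviate looks roughly like the sketch below; the collection name `Media` and its fields are assumptions for illustration, not necessarily what the demo uses.

```typescript
// Build the GraphQL body for a nearText search over a multimodal
// collection. "Media", "name", and "media" are assumed names.
function buildSearchQuery(concept: string, limit = 3): string {
  return `{
  Get {
    Media(
      nearText: { concepts: ["${concept}"] }
      limit: ${limit}
    ) {
      name
      media
      _additional { distance }
    }
  }
}`;
}

console.log(buildSearchQuery("dogs playing"));
```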
Learn more about multimodal applications.

Some credit goes to Steven for his Spirals template.