ChatGPT is a marvel of research and engineering, but it's much more than that. Beyond being an exceptional feat of technology, it has, probably for the first time, made AI accessible to everyone - not just AI/ML practitioners. LLMs are a household name now. Sit in a coffee shop and you'll overhear "ChatGPT can do it." Three years ago, that would almost surely have been a bunch of techies talking about their next project. Today, it could be architects or designers or (true story) doctors or lawyers talking about a quick something with AI.

The series:

Goals

In the spirit of AI for everyone, let's build a quick desktop application to interface with Meta AI's very capable LLaMA 3 8B model.

At the end of the series our desktop app should look like this:

[Image: the final application]

In Part I of this series, we are going to work through the setup required to create a desktop application with Tauri - a cross-platform desktop app toolkit built on Rust.

Who is this for?

You'll feel right at home if you are a programmer with some exposure to Rust and a bit of experience working with Svelte, React, or any other modern client-side framework.

So, without further ado, let’s begin.

Tools and Libraries

TL;DR

Github repo

Setup

Setting up tauri

Assuming you already have the Rust toolchain installed (including cargo), we are going to use cargo to install the tauri-cli tool.

cargo install tauri-cli

Setting up SvelteKit project

Creating a SvelteKit project

Let’s start by creating a Svelte project

npm create svelte@latest instruct
Note: `instruct` is the project name; you can choose anything you fancy.

Svelte will ask for some config to get you started, here are my configurations:

  1. For the template option, I'll go with a skeleton project - basically the bare minimum required to work with Svelte.

  2. At the next step, we'll choose TypeScript.

  3. For the rest of the options, let's go with the defaults; they are not important for us right now.

Configuring the SvelteKit project

Now that the project is created in the directory instruct, let's go in and complete a little more boilerplate.

cd instruct

We are going to use SvelteKit's SSG (Static Site Generation) mode. There are other modes, like SSR (Server-Side Rendering), but that's not the topic of this post, so I won't go deep into it. To run our desktop app client in SSG mode, we'll need to install @sveltejs/adapter-static:

npm install --save-dev @sveltejs/adapter-static

Feel free to dig a little deeper into @sveltejs/adapter-* here.

You can read about the SSR mode of SvelteKit here.

With @sveltejs/adapter-static added to our project, let's configure SvelteKit to use it: change the import from @sveltejs/adapter-auto to @sveltejs/adapter-static in instruct/svelte.config.js.

svelte.config.js
import adapter from '@sveltejs/adapter-static'; // changed from `adapter-auto`
import { vitePreprocess } from '@sveltejs/vite-plugin-svelte';

/** @type {import('@sveltejs/kit').Config} */
const config = {
	// Consult https://kit.svelte.dev/docs/integrations#preprocessors
	// for more information about preprocessors
	preprocess: vitePreprocess(),

	kit: {
		// adapter-auto only supports some environments, see https://kit.svelte.dev/docs/adapter-auto for a list.
		// If your environment is not supported, or you settled on a specific environment, switch out the adapter.
		// See https://kit.svelte.dev/docs/adapters for more information about adapters.
		adapter: adapter()
	}
};

export default config;

At this stage if we run

npm run dev

and navigate to localhost:5173 in the browser, we should see a welcome message from our new SvelteKit app.

One last bit of boilerplate - use your favorite editor to add a file +layout.ts to instruct/src/routes/, telling SvelteKit to prerender pages and not use SSR.

src/routes/+layout.ts
export const prerender = true
export const ssr = false

Note: If you are new to SvelteKit and files like +layout.ts or +page.ts feel alien to you, have a quick peek at the SvelteKit docs, which discuss the structure and routing of SvelteKit apps.

Setting up the instruct project for Tauri

What we have achieved so far is very close to what we would do for any of our web application projects; client-side developers have done this a thousand times. Let's change the game and introduce Tauri to turn our web app into a desktop application.

Initialize a tauri project

Inside our project root (in our case, the instruct directory), let's initialize the Tauri app. We are essentially telling Tauri to take over from here and be responsible for the application.

cargo tauri init

The tauri-cli will ask for a bunch of options step by step. We'll need to modify the answer to What is the URL of your dev server?. Since we are using SvelteKit defaults, the URL of our dev server is localhost:5173. We'll leave the other options as they are.
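For reference, an init session looks roughly like the following. This is a sketch assuming tauri-cli 1.x and our SvelteKit defaults - the exact prompt wording may differ between versions, and the `../build` answer assumes the output directory that @sveltejs/adapter-static produces:

```
$ cargo tauri init
✔ What is your app name? · instruct
✔ What should the window title be? · instruct
✔ Where are your web assets located? · ../build
✔ What is the url of your dev server? · http://localhost:5173
✔ What is your frontend dev command? · npm run dev
✔ What is your frontend build command? · npm run build
```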

One last thing to complete the setup: we need to update the tauri.bundle.identifier field in instruct/src-tauri/tauri.conf.json - this will come in handy when we build our application in Part III of the series. Let's give it a unique name; we are going to call it instruct.llm.

instruct/src-tauri/tauri.conf.json
{
	"build": { /* ... omitted ... */ },
	"package": { /* ... omitted ... */ },
	"tauri": {
		"allowlist": { "all": true },
		"bundle": {
			"active": true,
			"category": "DeveloperTool",
			// .. fields omitted ...
			"identifier": "instruct.llm",
			"longDescription": "",
			"macOS": { /* ... omitted ... */ },
			"resources": [],
			// ... omitted ...
		},
		"security": { /* ..fields omitted.. */ },
		// ... more ...
	}
}
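For context, the build section omitted above is where Tauri learns how to talk to our SvelteKit project. Given the answers we supplied during init, it should look roughly like this (a sketch of the Tauri v1 schema; the dev URL and the `../build` output directory are assumptions based on the SvelteKit defaults used in this post):

```json
{
	"build": {
		"beforeDevCommand": "npm run dev",
		"beforeBuildCommand": "npm run build",
		"devPath": "http://localhost:5173",
		"distDir": "../build"
	}
}
```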

That completes the basic setup for any Tauri + SvelteKit desktop app, but the project directory structure may feel daunting at this stage. Let's break it down.

Directory Structure of our Project instruct

Our project directory structure should now look like the following:

Read the comments on the relevant parts of the structure below; they should give us a good grip on the purpose and arrangement of each piece.

- instruct // this is our project root
	- .svelte-kit // SvelteKit specific stuff here, we'll not touch these
	- node_modules // we all know what this is
	- src // this is the working directory for your svelte app
		- lib // any library files you write for the client side will be kept here. SvelteKit can identify `$lib` as the import path
		- routes // `+page.ts`, `+layout.ts` etc. will be kept here, this is how SvelteKit routing works
		// We'll not touch the following yet. SvelteKit doc should give you a good idea of what these are
		- app.d.ts 
		- app.html
	- src-tauri // this is the `rust` project root, notice that this contains your `Cargo.toml` and also our `src/main.rs`
		- src // the `rust` `src` directory
			- main.rs // our entrypoint into this application
		- build.rs // tauri uses this build script to initialize and build the `tauri` app
		- Cargo.toml // the rust project, dependencies etc.
		- tauri.conf.json // `tauri` specific config
	- static // some static assets like `favicon` etc can be kept here
	- package.json // our SvelteKit project dependencies and definitions
	- svelte.config.js // Svelte project specific config, we updated this to use `@sveltejs/adapter-static` during the early setup
	- tsconfig.json // configurations for `Typescript`
	- vite.config.ts // SvelteKit uses `vite` for bundling and a bunch of other tooling, configurations for `vite`
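For the curious, src-tauri/src/main.rs contains nothing but stock scaffolding at this point. Roughly, what `cargo tauri init` generates looks like this (a Tauri v1 sketch; it needs the `tauri` crate declared in src-tauri/Cargo.toml to compile):

```rust
// src-tauri/src/main.rs - roughly the scaffolding `cargo tauri init` generates
// Hide the console window on Windows in release builds
#![cfg_attr(not(debug_assertions), windows_subsystem = "windows")]

fn main() {
    // Build the default Tauri application and run it with the context
    // generated at compile time from tauri.conf.json
    tauri::Builder::default()
        .run(tauri::generate_context!())
        .expect("error while running tauri application");
}
```

We'll extend this entrypoint in Part II when we wire up the backend.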

Now, armed with an understanding of our project's directory structure, let's run our app.

cargo tauri dev

And there you go: the SvelteKit welcome page running as a desktop app served by Tauri. Coming from a web-application development background, this was a euphoric moment for me. My own, native-ish desktop application!

Wrapping up & Next steps:

In this post, we set up the scaffolding required to run our desktop application, and we ran it. In the next post, Part II of this series, we'll work on the backend of our application, initialize the model, and run a test inference. In Part III, we'll write the client-side interface and put it all together with the communication layer. So, ride along ..

Before we close today ..

If you have found this post helpful, consider spreading the word; it would be a strong motivator for me to create more. If you find an issue or hit a snag, reach out to me @beingAnubhab.

Acknowledgements & reference

This project is built on the shoulders of stalwarts; a huge shout-out to all of them:

  1. The Rust maintainers for this awesome language
  2. The Tauri app and its creators and maintainers
  3. Meta for creating the LLaMA family of models and giving open-source AI a fair shot
  4. HuggingFace🤗 for everything they do
  5. Georgi Gerganov for creating the GGML/GGUF movement
  6. The llama.cpp maintainers for moving at breakneck speed
  7. The llama_cpp-rs maintainers for an awesome yet simple crate
  8. The QuantFactory team for the GGUF model files

And many, many more …