Course Practicalities

Tools and Environments


Main tools

The examples in this course are intentionally SDK-agnostic for the most part. To run them on your computer, you’ll need a working Deno installation and a command-line terminal.

The examples are designed to run locally from the command line; we do not, for example, build a web user interface.

Deno is also used in the Web Software Development course. If you have already taken that course, you are already familiar with the runtime and its tools.

The course materials use JavaScript’s raw Fetch API and JSON requests instead of provider-specific SDKs. Whenever possible, we also try to conform to the OpenResponses specification. This keeps the examples more portable and makes the surrounding software logic easier to inspect.
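
As a sketch of what that style looks like (the URL, model name, and request-body shape below are illustrative assumptions, not a specific provider’s contract):

```javascript
// A minimal sketch of plain JSON over the Fetch API, with no SDK involved.
// All names and values here are placeholders.
function buildRequestOptions(apiKey, model, input) {
  return {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Authorization": `Bearer ${apiKey}`,
    },
    body: JSON.stringify({ model, input }),
  };
}

const options = buildRequestOptions("your-api-key", "example-model", "Hello!");
// const response = await fetch("https://api.example.com/v1/responses", options);
// const data = await response.json();
```

Because the request is just a plain object and a JSON string, it is easy to log, inspect, and adapt to another provider.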

The graded programming exercises are also written so that the tests use stubs or local fixtures. This means you can complete and test the assignment logic without depending on an external provider or needing an API key.
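
To illustrate the idea (the response shape below is an invented fixture, not any provider’s real format), the logic under test can be exercised with plain local data:

```javascript
// Extract the text from a response object. The fixture stands in for a
// real API response, so no network access or API key is needed.
function extractText(response) {
  return response.output?.[0]?.content?.[0]?.text ?? "";
}

// A local fixture with a made-up shape, used instead of a live call.
const fixture = {
  output: [{ content: [{ type: "output_text", text: "Hello from a fixture!" }] }],
};

console.log(extractText(fixture)); // → "Hello from a fixture!"
```

The same function can later be pointed at a real response, but the test itself stays fast and offline.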

Running programs

You will commonly use the following commands:

$ deno run main.js
$ deno run --allow-read main.js
$ deno test
$ deno fmt
$ deno add jsr:@std/path@1.1.4

The permission flags matter. If a program reads files, accesses the network, or reads environment variables, you must allow those actions explicitly.


Permissions flags and program model

The permission flags also help you reason about the program.

If a program needs --allow-read, then file input is part of its behavior. If it needs --allow-net, then some part of the program communicates with an external service. If it needs --allow-env, then the program depends on local configuration outside the source file. These flags make dependencies visible in a way that helps both debugging and review.

When working with LLM-generated code, you should also get used to asking “What permissions does this need, and why?”. This is one of many ways to build an understanding of what the application actually does.

Environment variables

In later chapters, some command-line applications call LLM APIs. In those examples, configuration is read from environment variables such as LLM_API_URL, LLM_API_KEY, and LLM_MODEL.

In a Unix-like shell, environment variables can be set like this:

$ export LLM_API_URL="https://api.example.com/v1/responses"
$ export LLM_API_KEY="your-api-key"
$ export LLM_MODEL="example-model"

Then, inside a Deno program, the values can be read with Deno.env.get(...).

const apiKey = Deno.env.get("LLM_API_KEY");

If your program reads environment variables, you must run it with --allow-env.
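
A common pattern is to read all of the variables up front and fail fast when one is missing. The sketch below parameterizes the lookup so the validation logic can run anywhere; in a Deno program you would pass `(name) => Deno.env.get(name)` and run with `--allow-env`. The function name and error wording are our own, not part of the course API.

```javascript
// Read configuration through a lookup function and fail fast on missing values.
function readConfig(get) {
  const config = {
    apiUrl: get("LLM_API_URL"),
    apiKey: get("LLM_API_KEY"),
    model: get("LLM_MODEL"),
  };
  const missing = Object.entries(config)
    .filter(([, value]) => !value)
    .map(([name]) => name);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(", ")}`);
  }
  return config;
}

// In Deno (run with --allow-env):
// const config = readConfig((name) => Deno.env.get(name));

// Here, a plain object stands in for the environment:
const env = {
  LLM_API_URL: "https://api.example.com/v1/responses",
  LLM_API_KEY: "your-api-key",
  LLM_MODEL: "example-model",
};
const config = readConfig((name) => env[name]);
```

Failing fast like this turns a confusing mid-request error into a clear message at startup.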

Using environment variables this way has two advantages. First, the same code can be run against different providers or models without changing the source file. Second, secrets such as API keys do not need to be hard-coded into the repository.

Treat API keys like passwords. Do not hard-code them into source files, do not commit them into version control, and do not paste them into screenshots or discussion threads.

Optional access to external LLM services

Some examples can be followed as worked examples even if you do not have access to an external LLM API. However, to run the later CLI chat examples yourself as full end-to-end examples with a real LLM, you will need access to a provider that offers a compatible text-generation API.

Depending on your setup, you might run local models with e.g. Ollama, or create a free-tier account on HuggingFace, which allows experimenting with some of the models through their APIs. This is not a requirement or an expectation, however.

In real projects, teams often switch to an SDK once they have committed to a provider; examples include OpenAI’s SDK, Anthropic’s SDK, HuggingFace.js, and so on. These SDKs are often more ergonomic than raw fetch calls, but they are also more vendor-specific.

Later framework-variant chapters in this course take the same idea one step further with LangChainJS, which builds on top of provider integrations rather than replacing the need to understand the underlying engineering choices.