Visual Studio Code Installation
1. Installation
Open the appropriate OS section:
Expand for Windows
Download the installer from here.
Expand for RHEL
The original install doc can be referenced here.
Steps:
Import the Microsoft GPG key:
sudo rpm --import https://packages.microsoft.com/keys/microsoft.asc
Create the repo file:
sudo sh -c 'echo -e "[code]\nname=Visual Studio Code\nbaseurl=https://packages.microsoft.com/yumrepos/vscode\nenabled=1\ngpgcheck=1\ngpgkey=https://packages.microsoft.com/keys/microsoft.asc" > /etc/yum.repos.d/vscode.repo'
Update the package cache:
dnf check-update
Install VS Code:
sudo dnf install code
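For reference, the shell command above writes a repo file at /etc/yum.repos.d/vscode.repo equivalent to:

```ini
[code]
name=Visual Studio Code
baseurl=https://packages.microsoft.com/yumrepos/vscode
enabled=1
gpgcheck=1
gpgkey=https://packages.microsoft.com/keys/microsoft.asc
```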
Expand for Ubuntu
The original install doc can be referenced here.
If Snap is installed, run:
sudo snap install --classic code
Expand for Fedora
If Snap is installed, run:
sudo snap install --classic code
2. Configuration
2.1. Edit Config
Add the following to the VS Code settings file:
Open the config file for editing:

Unix config file location: ~/.config/Code/User/settings.json
(You may need to create the folder first with: mkdir -p ~/.config/Code/User)

Windows config file location: %APPDATA%\Code\User\settings.json
Add these key-value pairs:

```json
"workbench.sideBar.location": "right",
"files.autoSave": "afterDelay",
"window.titleBarStyle": "native",
"asciidoc.preview.useEditorStyle": false,
"workbench.colorTheme": "Visual Studio Light",
"workbench.iconTheme": "material-icon-theme",
"editor.codeLens": false
```

The window.titleBarStyle entry fixes the SSH X11 forwarding window-resize bug.
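If you prefer to apply these settings programmatically, here is a minimal sketch. It assumes the settings file is plain JSON without comments (VS Code also accepts JSONC, which json.load would reject), and uses the Linux path from above:

```python
import json
import os

# The new entries listed above, expressed as a Python dict.
NEW_SETTINGS = {
    "workbench.sideBar.location": "right",
    "files.autoSave": "afterDelay",
    "window.titleBarStyle": "native",
    "asciidoc.preview.useEditorStyle": False,
    "workbench.colorTheme": "Visual Studio Light",
    "workbench.iconTheme": "material-icon-theme",
    "editor.codeLens": False,
}

def merge_settings(path):
    """Merge NEW_SETTINGS into the JSON file at `path`, creating it if missing.

    Existing keys not in NEW_SETTINGS are preserved. Returns the merged dict.
    """
    os.makedirs(os.path.dirname(path), exist_ok=True)
    try:
        with open(path) as f:
            settings = json.load(f)
    except (FileNotFoundError, json.JSONDecodeError):
        settings = {}
    settings.update(NEW_SETTINGS)
    with open(path, "w") as f:
        json.dump(settings, f, indent=4)
    return settings

if __name__ == "__main__":
    merge_settings(os.path.expanduser("~/.config/Code/User/settings.json"))
```

Merging (rather than overwriting) keeps any settings you have already customized.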
3. Extension Configuration
3.1. Continue
In the Primary Side Bar, click on the Continue icon.
Click on the Assistants drop-down button.
Click on the Local Assistant Config button to open the YAML config file.
Create a YAML config. A reference is here.

Example YAML config file:

```yaml
name: Local Ollama Assistant
version: 1.0.0
schema: v1
models:
  - name: Ollama-Phi4
    apiBase: http://192.168.1.2:11434/
    provider: ollama
    model: phi4:latest
    roles:
      - autocomplete
      - chat
      - edit
      - apply
    defaultCompletionOptions:
      temperature: 0.3
  - name: Ollama-DeepSeek-R1-14b
    apiBase: http://192.168.1.2:11434/
    provider: ollama
    model: deepseek-r1:14b
    roles:
      - autocomplete
      - chat
      - edit
      - apply
    defaultCompletionOptions:
      temperature: 0.3
  - name: Ollama-DeepSeek-Coder-R2-16b
    apiBase: http://192.168.1.2:11434/
    provider: ollama
    model: deepseek-coder-v2:16b
    roles:
      - autocomplete
      - chat
      - edit
      - apply
    defaultCompletionOptions:
      temperature: 0.3
context:
  - provider: code
  - provider: docs
  - provider: diff
  - provider: terminal
  - provider: problems
  - provider: folder
  - provider: codebase
```
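As a sanity check on the structure above, the config can be mirrored as a plain Python dict (what a YAML parser would return) and validated with a short script. This is a sketch only: the required fields and role names follow the example here, not an exhaustive Continue schema.

```python
# Roles used by the example config above; treat anything else as a typo.
ALLOWED_ROLES = {"autocomplete", "chat", "edit", "apply"}

def validate_config(config):
    """Return a list of problems found; an empty list means the config looks OK."""
    problems = []
    for key in ("name", "version", "schema", "models"):
        if key not in config:
            problems.append(f"missing top-level key: {key}")
    for i, model in enumerate(config.get("models", [])):
        for key in ("name", "provider", "model", "apiBase"):
            if key not in model:
                problems.append(f"models[{i}]: missing {key}")
        for role in model.get("roles", []):
            if role not in ALLOWED_ROLES:
                problems.append(f"models[{i}]: unknown role {role!r}")
    return problems

# One model from the example, expressed as a dict.
config = {
    "name": "Local Ollama Assistant",
    "version": "1.0.0",
    "schema": "v1",
    "models": [
        {
            "name": "Ollama-Phi4",
            "apiBase": "http://192.168.1.2:11434/",
            "provider": "ollama",
            "model": "phi4:latest",
            "roles": ["autocomplete", "chat", "edit", "apply"],
            "defaultCompletionOptions": {"temperature": 0.3},
        },
    ],
}

print(validate_config(config))  # an empty list means no problems were found
```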