Running Ollama on Windows | Ubuntu Version (Much Awaited Video)
In this video, we are going to run Ollama on a Windows system.
Steps for running Ollama on a Windows system (the full command sequence is collected after the steps):
Step 1: Turn Windows features on or off (enable Virtual Machine Platform and Windows Subsystem for Linux)
Step 2: Open Command Prompt and run “wsl --install”
Step 3: In Command Prompt, type “wsl --user root -d ubuntu”
Step 4: Type “curl https://ollama.ai/install.sh | sh”
Step 5: Type “ollama serve”
Step 6: Type “ollama run mistral” or any model of your choice
Note: In order to stop everything, just run “wsl --terminate ubuntu”
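For quick copy-paste, here is the whole sequence in one place (a sketch of the steps above; “ubuntu” assumes the default distribution name, and mistral is just an example model):

# In Command Prompt (one-time setup; reboot if prompted):
wsl --install
# Open a root shell in the Ubuntu distribution:
wsl --user root -d ubuntu
# Inside Ubuntu, download and run the Ollama installer:
curl https://ollama.ai/install.sh | sh
# Start the Ollama server and leave it running:
ollama serve
# In a second WSL shell, pull and chat with a model:
ollama run mistral
# Back in Command Prompt, to shut everything down:
wsl --terminate ubuntu

While “ollama serve” is running, you can also sanity-check the HTTP API from another shell, e.g. curl http://localhost:11434/api/generate -d '{"model": "mistral", "prompt": "Hello"}'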
This is going to be a fun video.
Ollama: https://ollama.ai/
Let’s do this!
Join the AI Revolution!
#ai #ubuntu #ollamaonwindows #ollamamac #llm_selector #auto_llm_selector #localllms #github #streamlit #langchain #qstar #openai #ollama #webui #python #llm #largelanguagemodels
CHANNEL LINKS:
☕ Buy me a coffee: https://ko-fi.com/promptengineer
❤️ Subscribe: https://www.youtube.com/@PromptEngineer48
💀 GitHub Profile: https://github.com/PromptEngineer48
🔖 Twitter Profile: https://twitter.com/prompt48
🤠Join this channel to get access to perks:
https://www.youtube.com/channel/UCX6c6hTIqcphjMsXbeanJ1g/join
🎁Subscribe to my channel: https://www.youtube.com/@PromptEngineer48
If you have any questions, comments or suggestions, feel free to comment below.
🔔 Don’t forget to hit the bell icon to stay updated on our latest innovations and exciting developments in the world of AI!
Great. I did the same thing a while back, and it works perfectly.
Clarification at 4:15: when it pulls, does it download and then run the models locally?
My speeds seem much slower than in your demo. Tested on a Windows machine with 8 GB RAM and a 4 GB Nvidia card.
Awaited?!
WSL has been in use for quite some time now… WSL 1 was first released on August 2, 2016.
WSL 2 has been available on Windows 10 since June 2019.
So we have been running virtual Ubuntus for years.
I don't know…
Is it possible now to do this in Windows: https://youtu.be/lhQ8ixnYO2Y?si=NJe7P9a1zReBVvFm
What is the procedure?
A Windows build should come very soon! One of the latest pull requests added the last missing pieces for Windows support, and I was able to compile it myself, including CUDA support (install the Nvidia CUDA toolkit beforehand).
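For context, a rough sketch of that kind of from-source build (the exact steps depend on the commit being built, and Go plus the CUDA toolkit are assumed to be installed already):

git clone https://github.com/ollama/ollama
cd ollama
# the generate step compiles the bundled llama.cpp, picking up CUDA if the toolkit is found
go generate ./...
# build the ollama binary (ollama.exe on Windows)
go build .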
One variant without the hassle of compilation would be using Docker Desktop on Windows and running the Ollama container. But this requires a Windows version with Hyper-V support (that means Pro or better, AFAIK).
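For reference, that Docker route boils down to the standard commands for the ollama/ollama image (mistral is again just an example model):

# start the Ollama server container, persisting models in a named volume
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
# run a model inside the container
docker exec -it ollama ollama run mistral
# with an Nvidia GPU, add --gpus=all (requires the NVIDIA Container Toolkit)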
Excellent video. Thank you for sharing this workaround.
Thank you for this. I appreciate it.
This is awesome.
This is fascinating, but why use it this way over LM Studio?
Great workaround!