
Ollama does Windows?!?

Yup, Ollama is now on Windows. Native. Not just WSL2. And it’s working pretty well. It’s a preview release, but this is going to get a lot of folks excited.

This video was recorded on GCP with Windows Server 2022. I also tried the steps with Windows 10 Pro on Paperspace and Windows 11 Pro on Azure. Other than the new taskbar on Win11, they all looked and performed exactly the same.
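If you want a quick smoke test after the installer finishes, here's a minimal sketch in Python, assuming a default install where the Ollama service is listening on http://localhost:11434; a plain GET against the root returns a short status message.

```python
import urllib.request

# Default local endpoint for a stock Ollama install.
OLLAMA_URL = "http://localhost:11434"

with urllib.request.urlopen(OLLAMA_URL) as resp:
    # The root endpoint replies with a plain-text status ("Ollama is running").
    print(resp.read().decode())
```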


Alice AUSTIN

Alice AUSTIN is studying Cisco Systems Engineering. She is passionate about both hardware and software and writes articles and reviews for many IT websites.

23 thoughts on “Ollama does Windows?!?”

  • Thanks for the update.. I was struggling with WSL and almost gave up, then your vid showed up on my YouTube homepage. XD

  • Hi Matt, can we use Ollama without a GPU? If yes, how?

  • Yay! I have a Windows Gaming Laptop that I have been dying to try Ollama on because of the GPU. This is going to be Soooo much better! Thanks guys!

  • Glad to see this, though I'm already running it in Docker and I'm not sure what the advantage of switching to native is, considering I don't have an NVIDIA GPU.

  • This is great. Thank you! I do have a question. When I run LM Studio (Windows) I get a URL that I can use to call the model from my Python applications via the API. I'm assuming that Ollama has the same functionality. How do I set that up, and where do I find the URL to plug into my scripts? Thanks again! (A sketch addressing this follows the comments.)

  • What?! YAY!!! Thank you very much. Ollama, wanna play? 🙂

  • I haven't had time to check out Ollama yet; what are the advantages of Ollama over oobabooga? Is it mostly ease of use? Does it have significant features that justify jumping over from oobabooga? I already have 100s of GBs of models downloaded; can I convert them to your blob format without having to re-download? (See the import sketch after the comments.) Those are just my initial questions. I'll be checking it out regardless.

  • There are directions that work and those that don't; AI is difficult and there's a lot of fake stuff out there. Trusting you is so easy, you are GOLD!

  • Hi! I just downloaded the Windows build and replaced my WSL instance of Ollama with it. Awesome! But… I kinda would like to use my OLLAMA_MODELS env var just like on Linux and maybe reuse the already downloaded models… 😇

  • Hi! I come from using textgen webui; how does it compare to Ollama? What parameters (temp, rep penalty, etc.) does it use? Also, what system prompt does it use?

  • Matt, I want to move the model blobs off of my C: drive to another drive where I have more free disk space. Does the Windows Ollama support the OLLAMA_MODELS environment variable somehow? (See the sketch after the comments.) And is there a way to confirm that Ollama is detecting my NVIDIA GPU?

  • Now how do I get a code assistant in VS Code with Ollama on Windows?

  • Thanks for bringing the good news! Just installed it yesterday on WSL2, guess I'll reinstall it natively now.

  • Great work. Now can you create more channels for the Discord? One is a mess.

  • This week I got Ollama running under WSL. That was a fun exercise. And today I just installed Ollama for Windows. Having more OS options is great!!

  • Thanks for giving me enough time to get hooked on Linux.
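For the commenter asking about an LM Studio-style API URL: Ollama serves a local REST API on http://localhost:11434 by default, so there is no separate URL to hunt down. Below is a minimal sketch against its /api/generate endpoint; the model name llama2 is an assumption and should be whatever you've already pulled.

```python
import json
import urllib.request

# Ollama's default local endpoint.
URL = "http://localhost:11434/api/generate"

payload = {
    "model": "llama2",           # assumes `ollama pull llama2` has been run
    "prompt": "Why is the sky blue?",
    "stream": False,             # one JSON object instead of a token stream
}

req = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```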
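On reusing models you already have on disk: Ollama can import a local GGUF file through a Modelfile instead of re-downloading, which should cover GGUF-format downloads (models in other formats would need converting first). A sketch, with a hypothetical file path and model name:

```python
import pathlib
import subprocess

# Hypothetical path to a GGUF file you already have on disk.
gguf = pathlib.Path(r"D:\models\mistral-7b.Q4_K_M.gguf")

# A Modelfile whose FROM line points at a local file tells
# `ollama create` to import it rather than download anything.
pathlib.Path("Modelfile").write_text(f"FROM {gguf}\n")

subprocess.run(["ollama", "create", "mistral-local", "-f", "Modelfile"], check=True)
subprocess.run(["ollama", "run", "mistral-local", "Say hello."], check=True)
```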
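And on moving the blobs off C:: this sketch assumes the Windows preview honors the same OLLAMA_MODELS variable as the Linux build, which is worth verifying on your install; the target folder is hypothetical.

```python
import os
import subprocess

# Hypothetical target folder on a drive with more free space.
env = dict(os.environ, OLLAMA_MODELS=r"D:\ollama\models")

# Launch the server with the override; models pulled from now on land there.
subprocess.Popen(["ollama", "serve"], env=env)
```

To make it stick across reboots, set the variable machine-wide (System Properties, or `setx OLLAMA_MODELS D:\ollama\models` in a terminal) and restart Ollama. As for GPU detection, the log that `ollama serve` prints at startup is the usual place to check whether your NVIDIA card was picked up.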
