run AI on your laptop….it's PRIVATE!!
🔥🔥Join the NetworkChuck Academy!: https://ntck.co/NCAcademy
☕☕ COFFEE and MERCH: https://ntck.co/coffee
#AI #aiserver #ollama #llama3
by NetworkChuck
linux web server
Hopefully y'all are running 64 GB of RAM.
Is it only for laptops or PCs?
Fuck me, you are great at telling us about things… that have existed for a LONG LONG TIME
Hey Chuck, what do you think about Microsoft's new feature called Recall that will record everything you do on your computer?
Why did it start giving results before you finished asking? lmao
I got obsessed with this the other week, but it's kinda hard. I wanted to combine models but that didn't work. Make another how-to, but a new one.
❤
Yeah, I know. Last week I integrated it into my Bash chatbot, very sweet. I recommend the model mistral 7b.
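A rough sketch of that kind of integration (in Python rather than Bash, purely for illustration), assuming Ollama is running locally on its default port 11434 and mistral has already been pulled with `ollama pull mistral`:

```python
# Sketch of a terminal chatbot backed by a local Ollama model.
# Assumes the Ollama server is running locally (default API at http://localhost:11434)
# and that the "mistral" model has already been pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"
MODEL = "mistral"

def ask(messages):
    """Send the conversation so far to Ollama and return the reply text."""
    payload = json.dumps({"model": MODEL, "messages": messages, "stream": False}).encode()
    req = urllib.request.Request(OLLAMA_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        data = json.loads(resp.read())
    return data["message"]["content"]

def main():
    history = []
    print("Local chatbot (Ctrl+C to quit)")
    while True:
        user = input("> ")
        history.append({"role": "user", "content": user})
        reply = ask(history)
        history.append({"role": "assistant", "content": reply})
        print(reply)

if __name__ == "__main__":
    main()
```

The same loop should work with any model tag you have pulled; swap MODEL for whatever `ollama list` shows.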
But they don't run on a very low-end CPU and RAM. Please make a video on how to install other private LLMs, or ones that can work with a low-end CPU and little RAM, plsssss 😊
What are the disadvantages of this? Security issues?
For Windows it doesn't work
Useless, its generation is very slow without more GPU power.
I'm following your tutorial from the complete video, it's great, but my AMD Radeon RX 6550M is not one of the supported GPUs.
Is it wokified?
Can I use this in my app??
It's slow
I use ollama with llama3
"No internet required"
BRO HOW DO I DOWNLOAD IT THEN
Does it need a server?
What is a GPU?
Is it possible to run an Ollama model locally in Python?
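Yes. A minimal sketch using the `ollama` Python client (`pip install ollama`), assuming the Ollama server is running locally and llama3 has already been pulled; the exact return shape can differ slightly between client versions:

```python
# Minimal sketch: calling a locally pulled Ollama model from Python.
# Assumes `pip install ollama`, the Ollama server running locally,
# and `ollama pull llama3` already done.
import ollama

response = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Explain what a GPU does in one sentence."}],
)
print(response["message"]["content"])
```

If you'd rather not add a package, the same thing works by POSTing JSON to Ollama's local HTTP API at http://localhost:11434/api/chat, as in the chatbot sketch above.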
Like ChatGPT? You mean the 175B-parameter one? XD… maybe a 2B Gemma
If you're using an atomic Linux like Fedora Silverblue, you can use Homebrew, or just put it in Docker or Podman.
Who says AI nerds don't pull?
Found his video a lil over a month ago… Installed WSL on my Windows 11 Lenovo and then installed Ollama… I have a legal chatbot as well as a Dolphin uncensored Llama… it's pretty neat
I can only run it with my CPU… a lil slow but cool 😎
Can I download it on my phone!?
It is reasonable only if you have cheap energy and a strong device. Otherwise, you will pay a huge bill and get slow responses.
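To put very rough numbers on the energy side of that tradeoff (every figure below is a hypothetical placeholder, not a measurement):

```python
# Back-of-the-envelope energy cost for local inference.
# All numbers are hypothetical; plug in your own hardware draw and tariff.
gpu_watts = 350          # sustained draw under load (hypothetical)
hours_per_day = 2        # daily usage (hypothetical)
price_per_kwh = 0.30     # electricity price in $/kWh (hypothetical)

kwh_per_month = gpu_watts / 1000 * hours_per_day * 30
monthly_cost = kwh_per_month * price_per_kwh
print(f"{kwh_per_month:.1f} kWh/month ≈ ${monthly_cost:.2f}")
# 0.35 kW * 2 h * 30 days = 21 kWh ≈ $6.30/month at these example rates.
```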
nice
I miss "the 8 gb is not enough" statement. Nowadays it's very common.
Did you say you learned how from your own video?
How am I gonna generate images on my laptop, which has no GPU?
What about Suno AI!?
ChatGPT just grabs stuff off the internet.
Yeah, that video is BS because you need to have port 8080 free to run Open WebUI.
Thank you! Can I change the name?🤔
I love it! Except for one thing… You might want to mention that 8 GB of RAM is not nearly enough. It kind of killed my Apple M2.
pulling now, let's gooooo
will definitely be checking out the full vid mentioned
YESSSS thank you!
My GPU doesn't support Ollama 😭
Anyone know a solution so I can run AI and create an API for it? Like, what kind of hosting do I need, and do you know a cheap one?
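One common pattern is to put a tiny web API in front of a local Ollama instance. A minimal sketch, assuming Flask (`pip install flask`), Ollama running on the same machine at its default port, and llama3 already pulled; hosting-wise this runs on any box with enough RAM, but CPU-only servers will be slow, so cheap hosting usually means slow responses:

```python
# Minimal HTTP API in front of a local Ollama instance (sketch, not production-ready).
# Assumes: `pip install flask`, Ollama running locally (default port 11434),
# and a model such as "llama3" already pulled.
import json
import urllib.request

from flask import Flask, jsonify, request

OLLAMA_URL = "http://localhost:11434/api/generate"
app = Flask(__name__)

@app.route("/ask", methods=["POST"])
def ask():
    # Forward the incoming prompt to the local Ollama generate endpoint.
    prompt = request.get_json(force=True).get("prompt", "")
    payload = json.dumps({"model": "llama3", "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(OLLAMA_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        answer = json.loads(resp.read())["response"]
    return jsonify({"answer": answer})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

You can then POST a JSON body like {"prompt": "hello"} to /ask from your app.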
But it's less accurate because our laptops can't handle computation that big.