
Most of us are used to using internet chatbots like ChatGPT and DeepSeek in one of two ways: via a web browser or via their dedicated smartphone apps. There are two drawbacks to this. First, their use requires an internet connection. Second, everything you type into the chatbot is sent to the companies’ servers, where it is analyzed and retained. In other words: the more you use the chatbot, the more the company knows about you. American lawmakers have voiced particular concern about this in DeepSeek’s case.

But thanks to two innovative and easy-to-use desktop apps, LM Studio and GPT4All, you can bypass both of these drawbacks. With these apps, you can run various LLMs on your computer directly. I’ve spent the last week playing around with both, and thanks to each, I can now use DeepSeek without those privacy concerns. Here’s how you can, too.

Run DeepSeek locally on your computer without an internet connection

To get started, simply download LM Studio or GPT4All on your Mac, Windows PC, or Linux machine. Once the app is installed, you’ll download the LLM of your choice into it from an in-app menu. I chose to run DeepSeek’s R1 model, but the apps support myriad open-source LLMs.

LM Studio can run DeepSeek’s reasoning model privately on your computer.

Once you’ve done the above you’ve essentially turned your personal computer into an AI server capable of running numerous open-source LLMs, including ones from DeepSeek and Meta. Next, simply open a new chat window and type away just as you would when using an AI chatbot on the web.
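The “AI server” framing is literal: LM Studio can also expose the loaded model through a local, OpenAI-compatible HTTP server, so scripts on your machine can query it without any internet connection. The sketch below assumes you’ve enabled that server in LM Studio (it defaults to port 1234); the port and the model identifier are assumptions, so check them against your own setup.

```python
# Sketch: querying a model hosted locally by LM Studio through its
# OpenAI-compatible server. The port (1234) and model identifier are
# assumptions -- confirm both in your LM Studio settings.
import json
import urllib.request

LOCAL_SERVER = "http://localhost:1234/v1/chat/completions"


def build_chat_request(prompt: str, model: str = "deepseek-r1-distill-qwen-7b") -> dict:
    """Build the JSON body for an OpenAI-style chat completion call."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }


def ask_local_model(prompt: str) -> str:
    """Send the prompt to the local server; no internet connection needed."""
    body = json.dumps(build_chat_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        LOCAL_SERVER, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.loads(resp.read())
    return reply["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(ask_local_model("Can you teach me how to make a birthday cake?"))
```

Because the endpoint mimics OpenAI’s API shape, most tools that can talk to ChatGPT’s API can be pointed at this local address instead.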

The best thing about both these apps is that they are free for general consumer use, you can run several open-source LLMs in them (you get to choose which and can swap between LLMs at will), and, if you already know how to use an AI chatbot in a web browser, you’ll know how to use the chatbot in these apps.

But there are additional benefits to running LLMs locally on your computer, too.

The benefits of using an LLM locally

I’ve been running DeepSeek’s reasoning model on my MacBook for the past week without so much as a hiccup in both LM Studio and GPT4All. One of the coolest things about interacting with DeepSeek in this way is that no internet is required. Since the LLM is hosted directly on your computer, you don’t need any kind of data connection to the outside world to use it.

Running LLMs like DeepSeek in apps like GPT4All can help keep your data secure.

Or as GPT4All’s lead developer, Adam Treat, puts it, “You can use it on an airplane or at the top of Mount Everest.” This is a major boon to business travelers stuck on long flights and those working in remote, rural areas. 

But if Treat had to sum up the biggest benefit of running DeepSeek locally on your computer, he would do it in one word: “Privacy.”

“Every online LLM is hosted by a company that has access to whatever you input into the LLM. For personal, legal, and regulatory reasons this can be less than optimal or simply not possible,” Treat explains. 

For individuals, this can present privacy risks; for those who upload business or legal documents to an LLM for summarization, it could put their company and its data in jeopardy.

“Uploading that [kind of data] to an online server risks your data in a way that using it with an offline LLM will not,” Treat notes. An offline LLM running locally on your own computer doesn’t put your data at risk because, as Treat puts it, “Your data simply never leaves your machine.”

This means, for example, if you want to use DeepSeek to help you summarize that report you wrote, you can upload it into the DeepSeek model stored locally on your computer via GPT4All or LM Studio and rest assured the information in that report isn’t being sent to the LLM maker’s servers.
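GPT4All also ships Python bindings, so that kind of private summarization can be scripted. Below is a minimal sketch using the `gpt4all` package; the report filename and the model filename are hypothetical, so substitute whichever GGUF model you’ve actually downloaded in the app. The report text is only ever read from your own disk.

```python
# Sketch: summarizing a local report with the gpt4all Python package
# (pip install gpt4all). The report path and model filename below are
# assumptions -- use your own file and whichever model you downloaded.
from pathlib import Path


def build_summary_prompt(report_text: str) -> str:
    """Wrap the report in a plain summarization instruction."""
    return (
        "Summarize the following report in three bullet points:\n\n"
        + report_text
    )


if __name__ == "__main__":
    from gpt4all import GPT4All

    report = Path("quarterly_report.txt").read_text()      # hypothetical file
    model = GPT4All("DeepSeek-R1-Distill-Qwen-7B.gguf")    # assumed filename
    with model.chat_session():
        # The prompt, the report, and the reply all stay on your machine.
        print(model.generate(build_summary_prompt(report), max_tokens=512))
```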

The drawbacks of using an LLM locally

However, there are drawbacks to running an LLM locally. The first is that you’re limited to the open-source models that are available, which may lag behind the versions offered through a chatbot’s official website. And because only open-source models can be installed, you can’t use apps like GPT4All or LM Studio to run OpenAI’s ChatGPT locally on your computer.

Another disadvantage is speed. 

“Because you are using your own hardware (your laptop or desktop) to power the AI, the speed of responses will be generally slower than an online server,” Treat says. And since AI models rely heavily on RAM to perform their computations, the amount of RAM you have in your computer can limit which models you can install in apps like GPT4All and LM Studio.
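A rough rule of thumb (a back-of-envelope estimate, not a measured figure) is that a model’s weights alone take about parameters × bits-per-weight ÷ 8 bytes of memory, before runtime overhead such as the context cache. That’s why quantized models matter for laptops: a 7-billion-parameter model at 4-bit quantization needs roughly 3.5 GB just for its weights, while the same model at 8 bits needs about twice that.

```python
# Back-of-envelope estimate (an assumption, not a benchmark): weights of
# a quantized model take roughly params * bits_per_weight / 8 bytes,
# before runtime overhead such as the context (KV) cache.

def weight_memory_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate GB needed just to hold the model weights."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9


for params, bits in [(7, 4), (7, 8), (14, 4), (70, 4)]:
    print(f"{params}B model @ {bits}-bit ~ {weight_memory_gb(params, bits):.1f} GB")
```

Comparing those numbers against your machine’s RAM is a quick way to tell which models in the in-app catalog are realistic for your hardware.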

“As online servers are usually powered by very high-end hardware they are generally going to be faster and have more memory allowing for very fast responses by very large models,” explains Treat.

Still, in my testing of both LM Studio and GPT4All over the past week, I don’t consider the slower response times a dealbreaker. When using DeepSeek’s R1 reasoning model on the web, the version hosted on DeepSeek’s servers in China took 32 seconds to answer the prompt “Can you teach me how to make a birthday cake?” The local DeepSeek R1 model took 84 seconds in LM Studio and 82 seconds in GPT4All.

I’ve found that the benefits of running DeepSeek locally on my device using LM Studio and GPT4All far outweigh the extra waiting time required to get a response. Without a doubt, being able to access a powerful AI model like DeepSeek’s R1 locally on my computer anywhere at any time without an internet connection—and knowing the data I enter into it remains private—is a trade-off worth making.
