Hacker News
jedisct1 | 25 days ago | on: Show HN: Sweep, Open-weights 1.5B model for next-e...
Really cool. But how do you use it instead of Copilot in VSCode?
flanked-evergl | 24 days ago
Would love to know myself. I recall there was a VSCode plugin that did next edits and accepted a custom model, but I can't remember which one it was now.
replete | 24 days ago
Run a server with Ollama, then use the Continue extension configured to point at Ollama.
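A minimal sketch of that setup, assuming the model has been pulled into Ollama under the hypothetical tag `sweep-next-edit` (the actual registry name, if any, isn't given in the thread). Continue's classic `config.json` accepts an entry like this in its `models` list:

```json
{
  "models": [
    {
      "title": "Sweep 1.5B (local)",
      "provider": "ollama",
      "model": "sweep-next-edit"
    }
  ]
}
```

With `ollama serve` running locally, Continue will route requests for that model to Ollama's default endpoint.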
BoredomIsFun | 24 days ago
I'd stay away from Ollama; just use llama.cpp. It is more up to date, better performing, and more flexible.
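For reference, llama.cpp ships a `llama-server` binary that exposes an OpenAI-compatible HTTP API over a local GGUF file. A command sketch (the model filename, context size, and port here are assumptions, not values from the thread):

```shell
# Serve a local GGUF model over an OpenAI-compatible HTTP API.
# Replace the model path with whatever quantized GGUF you downloaded.
llama-server -m ./sweep-1.5b-q8_0.gguf --port 8080 -c 4096
```

Extensions that accept a custom OpenAI-compatible endpoint can then be pointed at `http://localhost:8080`.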
mika6996 | 24 days ago
But you can't just switch between installed models like in Ollama, can you?
BoredomIsFun | 24 days ago
llama-swap?
https://www.nijho.lt/post/llama-nixos/
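For context: llama-swap is a small proxy that starts and swaps llama.cpp server instances on demand, based on the model name in the incoming request, which restores Ollama-style model switching. A config sketch, assuming hypothetical model names and file paths (llama-swap substitutes `${PORT}` into each command):

```yaml
# llama-swap config.yaml -- maps requested model names to server commands.
models:
  "sweep-1.5b":
    cmd: "llama-server -m ./sweep-1.5b-q8_0.gguf --port ${PORT}"
  "qwen2.5-coder":
    cmd: "llama-server -m ./qwen2.5-coder-7b-q4_k_m.gguf --port ${PORT}"
```

Clients talk to llama-swap's single endpoint; it launches the matching `llama-server` process and stops the previous one.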