What's the cheapest option right now for me to spin up a Linux server somewhere for an hour with enough GPU to run the latest Mixtral model?
@doodlestein @simonw I was under the impression Vast.ai is unsecured. You're basically running code on some rando's computer, and they can see everything.
@doodlestein @simonw Network bandwidth has been an issue for me on Vast for short-term workloads. It ends up costing more than just using Lambda Labs, because models and dependencies take so long to download. Vast has an internet speed metric for machines, but I've found it to be hit or miss.
@doodlestein @simonw We also have datacenter hosts, with equipment in datacenters holding ISO 27001 (and other) certifications.
@doodlestein @simonw Yeah, but instances are pretty unstable; you never know when one will just up and stop working. Still the cheapest option by far, imho.