localgcp
One binary, nine services, zero cloud bills. Develop against GCP locally, including Vertex AI with real local inference.
Code written against google.golang.org/genai talks to Gemma, Llama, or any Ollama model. Same SDK, zero code changes, no API keys.
Nine services. One process.
Your GCP client libraries already support emulator host env vars. Point them at localhost and your existing code works with zero changes.
Vertex AI: generateContent and embeddings via Ollama. Proxy to Gemma, Llama, or any local model. Stub mode for CI/CD.
Cloud Storage: Bucket and object CRUD. Simple, multipart, and resumable uploads. Signed URLs. JSON and XML API paths.
Pub/Sub: Topics, subscriptions, publish, pull, StreamingPull. Push subscriptions and dead-letter topics.
Firestore: Document CRUD, real-time listeners (onSnapshot), queries with in/array-contains. Transactions.
Secret Manager: Secrets with versioning. Enable, disable, destroy states. Access by version number or the "latest" alias.
Cloud Tasks: Queue and task CRUD. HTTP target dispatch. Scheduling, retry with exponential backoff.
Cloud KMS: Encrypt/decrypt, asymmetric sign, HMAC. KeyRing and CryptoKey management. In-memory keys.
Cloud Logging: Write and query log entries. Filter by severity, text payload, or log name. Bounded in-memory store.
Cloud Run: Service CRUD with immediate operations. Auto-generated URIs. No polling required.
Three commands to local GCP
Install
Single binary. No Docker, no JVM, no runtime dependencies.
$ brew install slokam-ai/tap/localgcp
# or: go install github.com/slokam-ai/localgcp/cmd/localgcp@latest
Start
All nine services in the foreground. For Vertex AI with local models, start Ollama first.
$ ollama pull gemma3 # optional: for Vertex AI
$ localgcp up --vertex-model-map="gemini-2.5-flash=gemma3"
Connect
Sets emulator host env vars. Your GCP client libraries do the rest.
$ eval $(localgcp env)
$ go run ./your-app
Before and after
Without localgcp
- Spin up real GCP dev projects
- Pay cloud bills for test environments
- Burn Vertex AI credits on every prompt test
- Manage individual emulators per service
- No Cloud Storage or Cloud Tasks emulator from Google
- Leak API keys in CI/CD for AI tests
- Requires internet connection
With localgcp
- One binary, starts in milliseconds
- Zero cloud bills for dev and CI
- Run Vertex AI against local Gemma/Llama
- All services in one process
- Every service included, even the missing ones
- No API keys needed, ever
- Works fully offline
Start building locally
Nine services. 160+ tests. MIT licensed. Works with Go, Python, Java, Node.js.
View on GitHub · read the docs · download a binary