[k8s] Error: Internal error: Remote error: Could not reach quorum of 1 #478
Reference: Deuxfleurs/garage#478
Hi, after installing Garage via the Helm chart it started successfully and, for example, garage status works fine, but when I try to create a bucket or a key it fails with this error:
and logs these warnings:
Are there any additional steps needed to make it work in Kubernetes?
So, the problem was in
replicaCount: 1
It seems there should be more than one copy of Garage. I'll leave this issue open until there is a fix for single-node clusters.
I am not the author of the Helm chart, and I don't know how Kubernetes does things, but can you confirm that your node has a zone and capacity assigned in garage status? I.e. that a layout has been created and applied, even with just one node.
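For reference, a layout can also be created and applied by hand on a single node. A minimal sketch, assuming Garage's layout CLI (the node ID and zone name below are placeholders; exact capacity syntax varies between Garage releases):

```shell
# Show cluster status; the node ID appears in the output (placeholder here)
garage status

# Stage a layout change: assign the node to a zone with some capacity
# (older releases take a relative integer, newer ones a size like 1G)
garage layout assign -z dc1 -c 1G <node-id>

# Review the staged changes, then apply them as layout version 1
garage layout show
garage layout apply --version 1
```

Until a layout is applied, most operations (bucket or key creation) cannot reach quorum, which matches the error in the issue title.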
@lx
No, it wasn't. Also, Garage didn't offer to create a layout with a single node, but after changing
replicaCount
to 2, Garage offered the layout creation, which successfully fixed that issue.
The default configuration for the Garage Helm chart specifies 3 replicas for the pods, and 3 for the replication mode as well. If you change the number of replicas, you have to adjust the config provided to Garage and change the
replication_mode
to something compatible with the number of pods you are running.
@maximilien Yeah, but I set
replicaCount
and
replication_mode
to 1 from the first launch of the Helm chart: fresh install, single replica, replication_mode = 1.
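To illustrate the mismatch being discussed: the chart's pod count and Garage's replication factor have to agree. A hedged sketch of the two settings for a single-node setup, written as a small shell script so both fragments are in one place (the file names and key names are illustrative; check the chart's actual values file and the Garage configuration reference):

```shell
# Helm values fragment (illustrative): run a single Garage pod
cat > values-single.yaml <<'EOF'
replicaCount: 1
EOF

# Matching garage.toml fragment: a replication factor of 1,
# which Garage expects as a string value
cat > garage-fragment.toml <<'EOF'
replication_mode = "1"
EOF
```

If replicaCount is lowered without changing replication_mode, Garage still expects 3 replicas and cannot reach quorum with a single node.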
I don't understand this issue. Are the Helm chart files wrong? If so, can you fix them and make a PR?
Thanks, I'll see if I can reproduce this on my side.
Closing for inactivity