I have a very simple setup running Gitea, which I love. However, I enabled Elasticsearch because it makes searching much faster than the default method.
I have a VPS with 16GB of memory. The only things running on it are Nginx, PHP, MySQL, Docker, and a few other things. I very rarely hit over 6GB of usage.
The issue comes when I enable Elasticsearch. It seems to wipe me out, hitting 15.7GB of 16GB used as soon as I start it up.
I searched online and found out about /etc/elasticsearch/jvm.options.d/jvm.options and adding:
-XmxXG
-XmsXG
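So the drop-in file ends up looking something like this (the 4g values are only placeholders to show the format, not a recommendation):

```
# /etc/elasticsearch/jvm.options.d/jvm.options
# Set min and max heap to the same size so the JVM never resizes the heap.
-Xms4g
-Xmx4g
```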
The question is: what should this amount be? I read that by default Elasticsearch uses 50% of the system memory for its heap, but when I started it up it was wiping me out of memory and making the system almost have a stroke.
But setting it to 2GB seems to make it less responsive on the Gitea website, sometimes even timing the site out.
So I’m not sure what “range” I should be using here, or whether I’m going to have to upgrade my VPS to 32GB in order to run this properly.
If you already have Caddy running on that same Docker host, then it's very simple to add another proxy target through the Caddyfile.
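For example, exposing one more service is just another site block in the Caddyfile (hostname and upstream here are placeholders):

```
# Forward a new hostname to a container reachable on the same Docker network.
gitea.example.com {
	reverse_proxy gitea:3000
}
```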
I have been using Traefik for years and I am mostly happy with it, but I recently spent a day trying out Caddy together with Authelia for authentication. Here is what came out of it as a very basic example of using them together. It uses a custom Docker image for Caddy that contains a few extra modules, but it can easily be replaced with the basic official one, depending on which modules you need (for example, Let's Encrypt with the DNS-01 challenge requires modules for DNS providers). My example uses www.desec.io for that, but Cloudflare, DuckDNS, etc. are possible too.
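For reference, the custom image is nothing more than a two-stage build with xcaddy; the deSEC module path below is an assumption, so check the caddy-dns organization on GitHub for your provider:

```Dockerfile
# Build Caddy with an extra DNS provider module, then copy the binary
# into the stock image.
FROM caddy:2-builder AS builder
RUN xcaddy build --with github.com/caddy-dns/desec

FROM caddy:2
COPY --from=builder /usr/bin/caddy /usr/bin/caddy
```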
docker-compose.yml
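A minimal sketch of the compose file; image tags, paths and ports are placeholders to adapt:

```yaml
services:
  caddy:
    image: caddy:2              # or your custom image with DNS modules
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./required/Caddyfile:/etc/caddy/Caddyfile:ro
      - caddy_data:/data
      - caddy_config:/config

  authelia:
    image: authelia/authelia
    volumes:
      - ./required/configuration.yml:/config/configuration.yml:ro
      - ./required/users_database.yml:/config/users_database.yml

volumes:
  caddy_data:
  caddy_config:
```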
required/Caddyfile
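Roughly like this: Authelia gets its own hostname and forward_auth protects the app (hostnames and upstream ports are placeholders; older Authelia releases use /api/verify instead of /api/authz/forward-auth):

```
auth.example.com {
	reverse_proxy authelia:9091
}

gitea.example.com {
	forward_auth authelia:9091 {
		uri /api/authz/forward-auth
		copy_headers Remote-User Remote-Groups Remote-Email Remote-Name
	}
	reverse_proxy gitea:3000
}
```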
required/configuration.yml
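Something along these lines for a file-based, single-factor setup; domains, secrets and paths are placeholders and the exact keys vary a bit between Authelia releases:

```yaml
# Sketch with placeholder values; adjust keys to your Authelia version.
jwt_secret: 'change-me'   # newer releases nest this under identity_validation

authentication_backend:
  file:
    path: /config/users_database.yml

access_control:
  default_policy: deny
  rules:
    - domain: 'gitea.example.com'
      policy: one_factor

session:
  secret: 'change-me'
  cookies:
    - domain: 'example.com'
      authelia_url: 'https://auth.example.com'

storage:
  encryption_key: 'change-me-to-a-long-random-value'
  local:
    path: /config/db.sqlite3

notifier:
  filesystem:
    filename: /config/notification.txt
```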
required/users_database.yml
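And the users database is just a small YAML map; the hash below is a truncated placeholder, generate a real one with Authelia's built-in hash command:

```yaml
users:
  alice:
    displayname: "Alice Example"
    password: "$argon2id$v=19$m=65536,t=3,p=4$..."   # placeholder, not a real hash
    email: alice@example.com
    groups:
      - admins
```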