Put a curved screen on everything, microwave your Thanksgiving turkey, put EVERYTHING including hot dogs, ham, and olives in gelatin. Only useful things will have AI in them in the future, and I have a hard time convincing the hardcore anti-AI crowd of that.
Docker is only useful in so many scenarios. Nowadays people wrap basic binaries like `tar` in a container and call it a platform-agnostic solution. Sometimes people are just incompetent and treat `docker pull` as the only solution they know.
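To be concrete, this is the pattern I mean (a sketch; the image and paths are just examples): pulling a whole image to run one standard binary.

```sh
# "Platform agnostic" tar: pull an entire image to run one binary
# that the host almost certainly already ships with.
docker run --rm -v "$PWD:/data" -w /data alpine \
  tar -czf backup.tar.gz files/
```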
Docker has many benefits - containers can be more secure, they are easy to update, and, something many overlook, a Dockerfile is a set of detailed install instructions that actually works as long as the container works - useful when the wiki is not updated.
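A sketch of that last point (the app and package here are hypothetical): every step needed to get the thing running is spelled out and verified by the build.

```Dockerfile
# Executable install documentation: if this image builds and runs,
# the steps below are known to work. (App name is hypothetical.)
FROM debian:bookworm-slim
RUN apt-get update \
    && apt-get install -y --no-install-recommends some-hypothetical-app \
    && rm -rf /var/lib/apt/lists/*
EXPOSE 8080
CMD ["some-hypothetical-app", "--port", "8080"]
```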
Another benefit is that the application owner can change the underlying infrastructure without the user needing to care. Example - Pi-hole v5 is backend DNS + lighttpd for the web UI + PHP in one single container. In v6 (beta) they removed lighttpd and PHP and built the functionality into the core service. In my tests it went from 100 MB of RAM usage to 20 MB. They also changed the base from Debian to Alpine and the image size shrank a lot.
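If you want to reproduce that comparison yourself, one quick way (assuming your container is named pihole):

```sh
# Snapshot current memory use of the running container:
docker stats --no-stream --format '{{.Name}}: {{.MemUsage}}' pihole
# Compare image sizes after the Debian -> Alpine base change:
docker image ls pihole/pihole
```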
Next benefit - I am moving from x86 to ARM for my home server. Docker itself will figure out the right architecture and pull that image.
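That works because published tags are multi-arch manifests, so the same tag resolves to the matching variant per platform. A quick check (using pihole/pihole as the example image, assuming it publishes a manifest for your platform):

```sh
# On an ARM host this pulls the arm64 variant automatically:
docker pull pihole/pihole:latest
docker image inspect pihole/pihole:latest --format '{{.Os}}/{{.Architecture}}'
# e.g. prints: linux/arm64
```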
Sure - Ansible exists as one attempt to combat the problem of installation instructions, but it is not as popular and thus the community is smaller. It may also leave you in a bad state (it is not like containers, where you can delete everything and start over fresh easily).
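That delete-and-start-fresh workflow, as a sketch (the names and volume are illustrative):

```sh
# Wipe the broken container and recreate it from the image:
docker rm -f myapp
docker run -d --name myapp -v myapp-config:/etc/myapp myapp-image:latest
# If the persisted config itself is suspect, drop the volume too:
# docker volume rm myapp-config
```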
Then we have VMs - but IMO they waste too many resources.
Containerize everything!
Crypto everything!
NFT everything!
Metaverse everything!
This too shall pass.
Don’t forget microservices!
Docker: 😢
LXC – natively containerize an application (or multiple)
systemd-run – can natively limit CPU shares and RAM usage (quick sketch below)
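A minimal sketch of that on a cgroup v2 system (the ffmpeg command is just a stand-in; running it system-wide may prompt for authentication):

```sh
# Run a one-off command under a transient systemd scope with limits:
# CPUQuota=50% caps it at half a core, MemoryMax=512M is a hard ceiling.
systemd-run --scope -p CPUQuota=50% -p MemoryMax=512M -- \
  ffmpeg -i input.mkv output.mp4
```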