Deploy getML in Seconds & Run Your First Model in Minutes 🐳
This is one of our recent posts from LinkedIn. Follow us on LinkedIn to stay updated with regular insights and news.
Remember when "requires Docker to run" felt intimidating?
I’ve been there too. But fast-forward a decade, and now even my dev environment runs in a container. Why? Because speed, consistency, and simplicity matter.
And to make testing getML easier, we've released a minimal getML image on DockerHub and included a one-line Docker Compose service in our most recent 1.5 release:
curl -L https://raw.githubusercontent.com/getml/getml-community/1.5.0/runtime/docker-compose.yml \
  | docker-compose -f - up
This is all you need to spin up the getML engine: no cloning repositories, no dependency headaches. The getML Python API (pip install getml) in your environment will seamlessly communicate with the getML Compose service. This works across Windows, Linux, and macOS, giving you instant automated feature engineering on relational and time-series data, anywhere. Check out our Docker install guide for details.
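For readers curious what the one-liner actually pulls in: a Compose file of this kind typically defines a single engine service. The sketch below is illustrative only; the image name, tag, port range, and volume path are assumptions, not the contents of the published file.

```yaml
# Hypothetical sketch of a minimal getML Compose service.
# Image name, ports, and volume path are assumed, not taken from the
# published docker-compose.yml -- use the one-liner above for the real file.
services:
  getml:
    image: getml/getml:1.5.0        # assumed DockerHub image and tag
    ports:
      - "1708-1733:1708-1733"       # assumed engine/monitor port range
    volumes:
      - getml-data:/home/getml      # assumed: persist projects across restarts

volumes:
  getml-data:
```

Piping the real file into `docker-compose -f - up` reads the definition from stdin, so nothing ever needs to be saved to disk.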
Why getML? Manual feature engineering is slow, tedious, and prone to errors. getML automates this process, evaluating billions of features to accelerate your workflow. Ready to see it in action? Explore our application examples for inspiration.
Docker, relational learning, and minimal setup - what more could you ask for?
#mlops #docker #getml #relationaldata #featureengineering