Show HN: Truss – Serve any ML model without boilerplate code

by philipkiely on 7/29/2022, 3:06 PM with 9 comments

by 0xferruccio on 7/29/2022, 6:55 PM

This looks promising. It feels like, for non-ML engineers, it's very hard to figure out how to use models as part of a vanilla CRUD codebase.

For instance, in a Rails app the ML model would probably be served as a completely external service API generated with something like Truss, wrapped in a service class that just exposes the outputs and handles errors/input validation!
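
A minimal sketch of that pattern (in Python rather than Ruby, and with a hypothetical endpoint URL, payload shape, and class name rather than anything Truss-specific) might look like:

    import requests

    class SentimentService:
        # Thin wrapper so the rest of a CRUD app never touches ML code directly.

        def __init__(self, base_url="http://localhost:8080"):
            # Assumed prediction endpoint; point this at whatever the model server exposes.
            self.predict_url = f"{base_url}/v1/models/model:predict"

        def predict(self, text: str) -> float:
            # Input validation lives in the wrapper, not in the controller.
            if not text or not text.strip():
                raise ValueError("text must be a non-empty string")
            try:
                resp = requests.post(self.predict_url, json={"inputs": [text]}, timeout=5)
                resp.raise_for_status()
            except requests.RequestException as exc:
                # Error handling is also centralized here, so callers see one failure mode.
                raise RuntimeError(f"model service unavailable: {exc}") from exc
            return resp.json()["predictions"][0]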

by ricklamers on 7/29/2022, 5:28 PM

In this category I’m a big fan of https://github.com/bentoml/BentoML

What I like about it is their idiomatic developer experience. It reminds me of other Pythonic frameworks like Flask and Django in a good way.

I have no affiliation with them whatsoever, just an admirer.

by d136o on 7/29/2022, 6:27 PM

Superb product and team.

Worth looking into if you've done any engineering work around deploying ML models as, or within, a service.

by gorg93 on 7/29/2022, 4:39 PM

Looks interesting. What if I need to write some logic (pre/post prediction) in the prediction server?

by dweinus on 7/29/2022, 11:55 PM

Looks great! What is the argument to use this over MLFlow model packaging and serving?

by jphoward on 7/29/2022, 7:09 PM

This is likely to share its name with the next Prime Minister of the UK...