Researchers develop the fastest possible flow algorithm

by jeroenvlek on 6/29/2024, 11:12 AM with 52 comments

by nabla9 on 6/29/2024, 8:55 PM

The algorithm is near-linear asymptotically, in the limit as n -> inf.

At the end of the video they say there is no way that any implementation of their algorithm gets close to beating existing algorithms in the real world.

https://cacm.acm.org/research/almost-linear-time-algorithms-...
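The point about real-world performance comes down to constant factors. A rough numeric sketch (every constant below is made up purely for illustration, not taken from the paper) shows how a near-linear algorithm with a huge constant can lose to a classic polynomial-time algorithm at every practical input size:

```python
import math

# Hypothetical cost models. The constants are invented for illustration:
# a "galactic" near-linear algorithm with an enormous constant factor
# vs. a classic m^1.5-style algorithm with a small constant.
def near_linear_cost(m, c=1e6):
    # stand-in for m^(1+o(1)) running time with big hidden constants
    return c * m * math.log(m)

def classic_cost(m, c=1.0):
    return c * m ** 1.5

for m in [10**6, 10**9, 10**12, 10**16]:
    wins = near_linear_cost(m) < classic_cost(m)
    print(f"m = {m:.0e}: near-linear faster? {wins}")
```

With these (made-up) constants the crossover only happens around m ~ 10^16 edges, far beyond any real network, which is consistent with the video's caveat.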

by okintheory on 6/30/2024, 1:22 PM

Interestingly, the same guy also works on making 'theory-only' algorithms work well in practice [1]. But, it seems like that takes another 20 years -- [1] is building on a theory breakthrough from 2004 [2], but these algorithms are only starting to work in practice in 2024, IIUC. I guess that means there's hope for practical min-cost flow algorithms in 2044.

[1] https://arxiv.org/pdf/2303.00709

[2] https://arxiv.org/abs/cs/0310051

by c-smile on 6/29/2024, 6:59 PM

> Almost-Linear-Time Algorithm

From O(mn) to O(m) ... thus excluding n (the number of vertices) from the computation ...

Too good to be true?

by jpster on 6/30/2024, 6:01 AM

> A glance at the raw figures shows just how far we have come: until the turn of the millennium, no algorithm managed to compute faster than m^1.5, where m stands for the number of connections in a network that the computer has to calculate, and just reading the network data once takes m time. In 2004, the computing speed required to solve the problem was successfully reduced to m^1.33. Using Kyng’s algorithm, the “additional” computing time required to reach the solution after reading the network data is now negligible.

TFA didn’t describe Kyng’s breakthrough in terms of this m exponent it considers so important. What’s up with that?
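The quoted exponents do let you redo the arithmetic yourself. A back-of-the-envelope for a hypothetical network with m = 10^8 edges (the instance size is made up; only the exponents come from the article):

```python
# "Extra" work beyond reading the input once (which costs m), expressed
# as a multiple of the reading time, for each historical bound.
m = 10**8
bounds = [
    ("read input (m)",    m),
    ("pre-2000 (m^1.5)",  m ** 1.5),
    ("2004 (m^1.33)",     m ** 1.33),
    ("Kyng et al. (~m)",  m),  # additional time described as negligible
]
for label, cost in bounds:
    print(f"{label:>18}: {cost / m:,.0f}x the reading time")
```

For m = 10^8, the m^1.5 bound costs about 10,000x the time of reading the input, m^1.33 about 400x, and the new bound essentially 1x, which is presumably the comparison TFA was gesturing at.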

by rowanG077 on 6/30/2024, 1:28 AM

Sometimes I think we have lost the plot completely with complexity as a metric. Increasingly we are seeing algorithms which have optimized the complexity metric to an insane degree but which aren't actually useful.

by squarerootof-1 on 6/29/2024, 6:32 PM

Where is the paper / code for this?

by ecstrema on 6/30/2024, 10:46 AM

Maybe someone could clarify something for me here:

o(n) seems like a stronger statement to me than O(n), since all o(n) algorithms are O(n), but the reverse is not true.

Also, if o(n) applies to all n, however small, whereas O(n) applies only as n -> inf,

(From the Wikipedia page example: 2n = O(n) but 2n != o(n))

then doesn’t that mean this algorithm should be applicable even to small n’s? Then it would be the opposite of a galactic algorithm, as someone above suggested, wouldn’t it?

Or am I missing something?
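Both o(·) and O(·) are statements about the limit n -> inf, so neither says anything about small n: f = o(g) means the ratio f(n)/g(n) tends to 0, while f = O(g) only requires the ratio to stay bounded. A small numeric sketch of the Wikipedia example (illustrative only):

```python
import math

def ratio(f, g, ns):
    # sample f(n)/g(n) at increasing n to see its limiting behaviour
    return [f(n) / g(n) for n in ns]

ns = [10**2, 10**4, 10**6, 10**8]
# 2n / n stays at 2: bounded (so 2n = O(n)) but not -> 0 (so 2n != o(n))
print(ratio(lambda n: 2 * n, lambda n: n, ns))
# (n / log n) / n -> 0: so n / log n = o(n)
print(ratio(lambda n: n / math.log(n), lambda n: n, ns))
```

So o(n) is indeed the stronger asymptotic claim, but it still says nothing about how the algorithm behaves on small inputs; that is decided by the hidden constants.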

by ziofill on 6/30/2024, 12:16 AM

damn you constant factors [shakes fist in the air]

by Sniffnoy on 6/29/2024, 7:23 PM

The abstract just says the time is m^(1+o(1))... anyone know if a more specific bound is stated anywhere?
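For what m^(1+o(1)) means concretely: the exponent tends to 1 as m grows, so for example any m * polylog(m) running time qualifies. A sketch of the "effective exponent" log f(m) / log m for one such function (the polylog factor here is my own example; the paper's actual subpolynomial factor may be different):

```python
import math

def effective_exponent(m):
    # example of an m^(1+o(1)) running time: m * log^3 m;
    # its effective exponent log(f(m)) / log(m) tends to 1 as m -> inf
    f = m * math.log(m) ** 3
    return math.log(f) / math.log(m)

for m in [10**3, 10**6, 10**9, 10**12]:
    print(f"m = {m:.0e}: effective exponent ~ {effective_exponent(m):.3f}")
```

The exponent decreases toward 1, but slowly, which is why the abstract can only honestly claim m^(1+o(1)) rather than a clean O(m log^k m) bound unless one is stated elsewhere in the paper.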

by imtringued on 6/30/2024, 8:30 AM

I was hoping for some kind of evolutionary algorithm. Giving up optimality in exchange for being able to solve problem instances with billions of items would be worth it.

by josefrichter on 6/30/2024, 6:22 AM

I don’t want to spam, but I’ve been using the rome2rio website/app to find complex connections. They’re definitely not using this algorithm, but I’ve always been fascinated that you get complex results almost immediately. I don’t know how they do it, but for me it’s one of the most fascinating services on the internet. Great job. [I’m not affiliated with them in any way]

by I_am_tiberius on 6/30/2024, 2:19 PM

Can this be used to improve the Bitcoin Lightning Network?

by imvetri on 6/30/2024, 5:25 AM

My thoughts: solving a problem purely for computational efficiency can seem pointless. Why? Take a look at AI neural networks, where people just blast computational resources at the problem. But maybe one day this result will help there.

Replying to myself: I appreciate this post. Now, back to writing.

Appreciation: out of so many less interesting posts, this one caught my attention, and yet nowhere does it explain how the algorithm works or, most importantly, why it is needed.