2000-2003; both are prehistoric. We have neural networks now to do things like upscaling and colorization.
rahen 7 minutes ago [-]
I see it the same way I see 'Applied Cryptography'. It’s old C code, but it helps you understand how things work under the hood far better than a modern black box ever could. And in the end, you become better at cryptography than you would by only reading modern, abstracted code.
vincenthwt 4 hours ago [-]
Yes, those methods are old, but they're explainable and much easier to debug or improve than black-box neural networks. They're still useful in many cases.
earthnail 4 hours ago [-]
Only partially. The chapters on edge detection, for example, only have historic value at this point. A tiny NN can learn edges much better (which was the claim to fame of AlexNet, basically).
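Roughly what "a tiny NN" means here, as a sketch (PyTorch and the layer sizes are my own assumptions, not anything from the book or the thread): a single 3x3 convolution layer which, once trained, tends to end up with Sobel-like edge filters.

    import torch
    import torch.nn as nn

    class TinyEdgeNet(nn.Module):
        """A 'tiny NN': one 3x3 conv layer whose learned kernels act as edge detectors."""
        def __init__(self, n_filters: int = 8):
            super().__init__()
            self.conv = nn.Conv2d(1, n_filters, kernel_size=3, padding=1)

        def forward(self, x):
            return torch.relu(self.conv(x))

    net = TinyEdgeNet()
    img = torch.rand(1, 1, 64, 64)   # dummy grayscale image
    feat = net(img)                  # 8 learned edge-like response maps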
grumbelbart2 3 hours ago [-]
That absolutely depends on the application. "Classic" (i.e. non-NN) methods are still very strong in industrial machine vision applications, mostly due to their momentum, explainability / trust, and performance / cost. Why use an expensive NPU if you can do the same thing in 0.1 ms on an embedded ARM?
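For a sense of what that costs, a 3x3 Sobel filter is a handful of multiply-adds per pixel; a rough Python sketch (numpy/scipy chosen here just for illustration):

    import numpy as np
    from scipy.ndimage import convolve

    def sobel_edges(img: np.ndarray) -> np.ndarray:
        """Classic Sobel gradient magnitude: two 3x3 convolutions per pixel."""
        kx = np.array([[-1, 0, 1],
                       [-2, 0, 2],
                       [-1, 0, 1]], dtype=np.float32)
        ky = kx.T
        gx = convolve(img.astype(np.float32), kx, mode="nearest")
        gy = convolve(img.astype(np.float32), ky, mode="nearest")
        return np.hypot(gx, gy)

    # edges = sobel_edges(gray_frame)  # gray_frame: 2-D uint8 array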
frankie_t 16 minutes ago [-]
I wonder if doing classical processing of real-time data as a pre-phase, before feeding it into a NN, could be beneficial?
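One way that pre-phase could look, sketched with assumed components (Sobel as the classical step, a throwaway CNN standing in for the downstream network):

    import numpy as np
    import torch
    import torch.nn as nn
    from scipy.ndimage import sobel

    def preprocess(frame: np.ndarray) -> torch.Tensor:
        """Classical pre-phase: normalize and add a gradient-magnitude channel."""
        f = frame.astype(np.float32) / 255.0
        gx, gy = sobel(f, axis=1), sobel(f, axis=0)
        edges = np.hypot(gx, gy)
        stacked = np.stack([f, edges])                  # (2, H, W)
        return torch.from_numpy(stacked).unsqueeze(0)   # (1, 2, H, W)

    # Small stand-in network that consumes the raw frame plus the edge channel.
    net = nn.Sequential(nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(),
                        nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 2))
    frame = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
    logits = net(preprocess(frame))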
fsloth 3 hours ago [-]
"The chapters on edge detection, for example, only have historic value at this point"
Are there simpler, faster and better edge detection algorithms that are not using neural nets?
4gotunameagain 4 hours ago [-]
Classical CV algorithms are always preferred over NNs in every safety-critical application.
Except self-driving cars, and we all see how that's going.