We are Sergiy, Davit and Jason, founders of Snark AI (https://snark.ai). We provide low-cost GPUs for Deep Learning training and deployment on semi-decentralized servers.

We started Snark AI during our PhD programs at Princeton University. As deep learning researchers, we constantly ran short of GPU resources.

Renting GPUs on the cloud didn't fit our budget, and purchasing GPU cards was difficult: at the time, crypto-miners were buying up much of the supply. Then we noticed that GPU mining profits lag far behind public cloud GPU prices.

On top of that, we discovered that there's a way to run neural network inference and crypto-mining simultaneously without hurting the mining hash rate. This is a little counterintuitive, but anti-ASIC hashing algorithms are deliberately designed to be extremely memory-intensive, which leaves a good chunk of the CUDA cores idle.
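To see why the cores sit idle, here's a back-of-the-envelope estimate. Every figure below (peak FP32 throughput, hash rate, arithmetic work per hash) is an illustrative assumption we picked for the sketch, not a measured value:

```python
# Rough estimate of how much GPU compute sits idle during
# memory-hard (anti-ASIC) mining. All figures are illustrative
# assumptions, not measurements.

PEAK_TFLOPS = 8.9          # assumed FP32 peak of a GTX 1080-class card
HASH_RATE_MHS = 25.0       # assumed memory-hard hash rate, in MH/s
FLOPS_PER_HASH = 20_000.0  # assumed arithmetic work per hash (rough)

# FLOP/s actually consumed by mining vs. what the card can do
flops_used = HASH_RATE_MHS * 1e6 * FLOPS_PER_HASH
utilization = flops_used / (PEAK_TFLOPS * 1e12)
idle_fraction = 1.0 - utilization

print(f"compute utilization by mining: {utilization:.1%}")
print(f"idle compute available for inference: {idle_fraction:.1%}")
```

Under these assumptions the miner touches only a few percent of the card's arithmetic throughput because it is bottlenecked on memory bandwidth, so most of the compute can be scheduled for inference kernels instead.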

We can use that leftover compute to run neural network inference extremely cost-efficiently, which could be a lifesaver for large-scale inference tasks (more details at http://snark.ai/blog). We would love your feedback on what the experience is like training deep networks through our platform and then deploying them.
