Lo-Fi Machine Learning
======================

May 23rd, 2023

A coworker and I have started working on a project that gets me back to my scrappy, DIY, low-power CS interests: lo-fi machine learning. God I wish I'd come up with that title but, unfortunately, I didn't. It's an outgrowth of stuff we'd been talking about for a while, though. Questions like:

* Is it possible to do interesting things in machine learning that don't require powerful hardware?
* Can you teach ML without having to teach a framework with a bunch of dependencies that will be obsolete in six months?
* Can ML be separated from dataset maximalism and brought to something more pro-social and small scale?

We don't entirely know yet, but part of the hope is that we can find out by starting with a stripped-down library: one that's not built for Big Machine Learning but is, instead, built around being understandable and extensible, implementing as much from scratch as possible, everything from Markov processes to baby's first automatic differentiation library. It doesn't have to be super efficient if we're focusing on small datasets and intentionally janky uses of ML for artistic purposes.

So I've been brainstorming a lot of demos and collecting resources the past few days, and basically all of next month we're going to be writing a grant to help support our work as faculty members at the college. We'll see what happens!
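To give a flavor of what "from scratch" can mean here, a word-level Markov chain fits in a couple dozen lines of dependency-free Python. This is just my own illustrative sketch, not code from the actual library; the function names and toy corpus are invented for the example:

```python
import random
from collections import defaultdict

def build_chain(words):
    """Map each word to the list of words observed after it in the corpus."""
    chain = defaultdict(list)
    for a, b in zip(words, words[1:]):
        chain[a].append(b)
    return chain

def generate(chain, start, length, seed=None):
    """Walk the chain, picking a random observed successor at each step."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        followers = chain.get(out[-1])
        if not followers:  # dead end: this word was never followed by anything
            break
        out.append(rng.choice(followers))
    return " ".join(out)

corpus = "the cat sat on the mat and the cat ran".split()
chain = build_chain(corpus)
print(chain["the"])              # successors of "the", duplicates and all
print(generate(chain, "the", 6, seed=1))
```

Storing duplicate successors in a plain list gives you the transition probabilities for free: `random.choice` over the list is already weighted by observed frequency, no probability table needed. That's the kind of small, legible trick a teaching-oriented library can lean on.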