I'd say since you're a beginner, it's much better to implement your regression functions and any necessary helper functions (train/test split, etc.) yourself in the beginning. Learn the necessary linear algebra and quadratic programming, and try to implement linear regression, logistic regression, and SVMs using only numpy and cvxpy. Once you get the hang of it, you can jump straight into sklearn and be confident that you understand roughly what those "black boxes" really do, which will also help you a lot with troubleshooting.
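To give a rough idea of what that first step can look like, here's a toy sketch with made-up data that solves ordinary least squares via the normal equations (the data and variable names are just illustrative):

```python
import numpy as np

# toy data: y = 3x + 2 plus a bit of noise
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(100, 1))
y = 3.0 * X[:, 0] + 2.0 + 0.1 * rng.standard_normal(100)

# hand-rolled train/test split
idx = rng.permutation(len(X))
train, test = idx[:80], idx[80:]

# append a bias column and solve the normal equations: w = (X^T X)^{-1} X^T y
Xb = np.hstack([X[train], np.ones((len(train), 1))])
w = np.linalg.solve(Xb.T @ Xb, Xb.T @ y[train])

# evaluate on the held-out split
Xb_test = np.hstack([X[test], np.ones((len(test), 1))])
print("weights:", w, "test MSE:", np.mean((Xb_test @ w - y[test]) ** 2))
```

Comparing the weights you get against sklearn.linear_model.LinearRegression on the same data is a good sanity check once you move on to sklearn.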
For neural networks and deep learning, pytorch is establishing itself as the industry standard right now. Look up "adjoint automatic differentiation" ("backpropagation" doesn't do it justice, since pytorch actually implements a very general dynamic AAD) and you'll understand the "magic" behind the gradients that pytorch gives you. Karpathy's YouTube tutorials are a really good intro to AAD/autodiff in the context of deep learning.
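To make the autodiff point a bit more concrete, here's a tiny illustrative sketch (not a real training loop) of pytorch recording operations as they run and handing you gradients via the reverse pass:

```python
import torch

# leaf tensors: pytorch will accumulate gradients into these
w = torch.tensor(2.0, requires_grad=True)
b = torch.tensor(0.5, requires_grad=True)

x = torch.tensor([1.0, 2.0, 3.0])
y_true = torch.tensor([3.0, 5.0, 7.0])

# forward pass: the computation graph is built dynamically as these ops execute
y_pred = w * x + b
loss = ((y_pred - y_true) ** 2).mean()

# reverse (adjoint) pass: propagates d(loss)/d(parameter) back through the graph
loss.backward()
print(w.grad, b.grad)
```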
So I should learn sklearn first before pytorch to understand the basics?

Linear and logistic regression are much easier (and less error-prone) to implement from scratch than neural network training with backpropagation.
That way you can still follow the progression I suggested: implement those regressions by hand using numpy -> compare against (and appreciate) sklearn -> implement SVMs by hand using cvxpy -> appreciate sklearn again.
If you get the hang of "classical" ML, then deep learning becomes easy as it's still machine learning, just with more complicated models and no closed-form solutions.
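For reference, the "SVMs by hand using cvxpy" step can be as small as a soft-margin primal like this (toy data, illustrative variable names):

```python
import cvxpy as cp
import numpy as np

# two noisy clusters with labels in {-1, +1}
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 1.0, size=(50, 2)), rng.normal(2.0, 1.0, size=(50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])

# soft-margin linear SVM, primal form:
#   minimize    0.5 * ||w||^2 + C * sum(slack)
#   subject to  y_i * (w . x_i + b) >= 1 - slack_i,  slack_i >= 0
w = cp.Variable(2)
b = cp.Variable()
slack = cp.Variable(100, nonneg=True)
C = 1.0

objective = cp.Minimize(0.5 * cp.sum_squares(w) + C * cp.sum(slack))
constraints = [cp.multiply(y, X @ w + b) >= 1 - slack]
cp.Problem(objective, constraints).solve()

print("w:", w.value, "b:", b.value)
```

Checking the resulting hyperplane against sklearn.svm.SVC(kernel="linear") with the same C should give you nearly the same answer, which is exactly the "appreciate sklearn again" step.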
Aight thanks.