Comments on Machined Learnings: "Productivity is about not waiting"

Paul Mineiro (https://www.blogger.com/profile/05439062526157173163), 2013-06-08:

Re: Sublinear debugging. There is something substantive that I neglected to mention above, namely: avoid techniques during experimentation that do not admit this kind of treatment.

There is often a choice between optimization methods with poor asymptotic convergence but cheap steps (e.g., SGD) and optimization methods with fast asymptotic convergence but expensive steps (e.g., quasi-Newton). The former tend to provide good intermediate information, whereas the latter can look like they are making little progress initially even though they end up somewhere better. So you would use the former during experimentation and the latter for the finished product.

Unknown (https://www.blogger.com/profile/16570083132483784714), 2013-06-08:

Agree completely. The first ("Use Less Data") is at the top of my list too and cannot be recommended enough.

Isn't "Sublinear Debugging" just a fancy term for the judicious use of print statements?

When dealing with a new data set, I start with the simple or obvious features (the low-hanging fruit) to get everything working, and then go back to optimize the feature engineering.

Very useful article. Best ...
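A minimal sketch of the trade-off described above, using a synthetic least-squares problem of my own construction (not from the comment): SGD takes cheap per-example steps and emits a usable loss reading while it runs, whereas a second-order method pays a more expensive step (here, a full Newton solve, which on a quadratic reaches the optimum in one step) and offers little intermediate signal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic least-squares problem: minimize ||A w - b||^2 / (2 n).
n, d = 200, 10
A = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
b = A @ w_true + 0.1 * rng.normal(size=n)

def loss(w):
    r = A @ w - b
    return 0.5 * np.mean(r * r)

# --- SGD: cheap steps, progress is observable as it runs -----------------
w = np.zeros(d)
lr = 0.05
sgd_trace = []
for step in range(500):
    i = rng.integers(n)                  # one random example per step
    g = (A[i] @ w - b[i]) * A[i]         # stochastic gradient
    w -= lr * g
    if step % 100 == 0:
        sgd_trace.append(loss(w))        # the "sublinear debugging" signal

# --- Second-order: one expensive step, no intermediate signal ------------
# On this quadratic objective a single Newton step from the origin lands
# at the exact minimizer; a quasi-Newton method would approximate this.
H = A.T @ A / n                          # Hessian of the loss
g0 = -A.T @ b / n                        # gradient at w = 0
w_newton = np.linalg.solve(H, -g0)

print("SGD losses:", [round(v, 3) for v in sgd_trace])
print("Newton loss:", round(loss(w_newton), 4))
```

The SGD trace lets you judge within a handful of cheap steps whether the setup is sane; the Newton solve tells you nothing until its (comparatively expensive) linear solve completes, but finishes at a better point.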