
Your answer seems to suggest that solving such problems is merely "recognizing which algorithms to apply", which is another way of saying that they are more "pattern matchers" than "true reasoners". I would think, on the contrary, that these problems (at least the tougher ones coming in about two weeks) require more than pattern matching, but I'm not sure exactly what my thoughts are on that.


Consider setting aside the question of how or why LLMs can do this sort of thing; it matters less once you simply try it out. I wouldn't be surprised if many of these problems are in fact solvable by LLMs, just from my experience using them on relatively novel issues in my day-to-day work. There will of course be mistakes and hallucinations, but in a proper dialogue with a motivated programmer, I bet it works >50% of the time.



