I see this often with both new and experienced developers: they have one way of doing a thing, and when presented with a new problem they fall back on what they're used to, even if it's not the optimal solution. It will probably work if you brute-force it into your usual patterns, but sometimes a different approach is much easier to implement and maintain, as long as you're willing to learn it and, more importantly, know it exists in the first place.
On a less abstract level, I guess my question is - how would I go about learning different design patterns and approaches to problem solving if I don't know they exist in the first place? Is it just a matter of proactive learning, where I'm expected to know all of them, and their uses, in advance?
Let's say, for example, I need to create a system for inserting a large amount of data from files into the database, or a service with many scheduled tasks, or a user authentication system. Before sitting down and developing those the way I usually do, what kind of steps could I take to learn a potentially better way of doing them?
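To make the first example concrete: the "usual" way is often a loop issuing one INSERT per row, while the approach worth discovering is batching the inserts (or reaching for the database's bulk loader, like PostgreSQL's COPY). Here's a minimal sketch using Python's built-in sqlite3 module, with the table and rows invented for illustration; both styles are shown side by side purely for comparison:

```python
import sqlite3

# Stand-in for data parsed out of the input files (hypothetical rows).
rows = [("alice", 30), ("bob", 25), ("carol", 41)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, age INTEGER)")

# The "usual" pattern: one INSERT statement per row.
for name, age in rows:
    conn.execute("INSERT INTO users VALUES (?, ?)", (name, age))

# A different approach worth knowing about: batch the inserts in a
# single call inside one transaction, which drivers generally optimize.
with conn:
    conn.executemany("INSERT INTO users VALUES (?, ?)", rows)

conn.close()
```

Knowing that something like executemany or COPY exists at all is exactly the kind of thing the question is asking about.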
It's more like languages evolved to incorporate the most common idioms and patterns of their ancestors. ASM abstracted common binary sequences. C abstracted common ASM control structures and call stacks. Java leaned hard on object orientation to enable the compositional and inheritance-based patterns already widely used in C and early OO languages. Python bakes a lot of those patterns into the language itself, which makes things like the Null Object pattern unnecessary.
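To illustrate that last point: where a Java- or C-style design might define a dedicated do-nothing class so callers never have to check for null, Python usually gets by with an optional argument and a one-line guard. A rough sketch, with the logger and function names made up for illustration:

```python
class NullLogger:
    """Classic Null Object: a stand-in that safely does nothing."""
    def log(self, message):
        pass

def process_order_classic(order, logger=None):
    # Null Object style: always hand the rest of the code *some*
    # object it can call without checking.
    logger = logger or NullLogger()
    logger.log(f"processing {order}")

def process_order(order, logger=None):
    # The more typical Python baseline: an optional argument and a
    # guard, no extra class needed.
    if logger is not None:
        logger.log(f"processing {order}")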