As the winter months come to an end, many gardeners start to wonder when they should begin tending to their plants. While some assume that gardening tasks belong only to spring and summer, there are several good reasons to start caring for your garden plants from January onwards.