Somewhat. I'd consider it unrealistic to expect someone to be up to date on everything, though if they're working within a specific area, there may be an expectation to stay current within that realm of knowledge. This does set a dangerous precedent, since one could accumulate knowledge of various systems and be expected to keep track of all of it in their head. I doubt there is an "industry standard" given the range of demands placed on different developers. A developer in a Research & Development lab would likely spend a lot of time exploring new technologies, which is quite different from developers in an IT department who may mostly be customizing software that was developed in-house or bought off-the-shelf. I can remember getting training on some software, or being given time to experiment with new stuff along those lines, but I'm not sure how standard that would be.
No, though there can be exceptions here. My experience is that some time will be spent developing skills, but mostly on an "as needed" basis, which makes it hard to plan. If a bug comes up in a part of a code base I don't know, then I'll spend some time learning that code base. Similarly, there may be slow periods where developers can work on technical debt rather than new features; for example, some companies may not want to do a release in December, so the developers get that month to do the various housekeeping tasks that have been left until there was some bandwidth.