There are many variables to consider when deciding whether to add new technologies to a legacy application.
First, what is the long-term plan for this application? Is there a funded plan, with a date, to retire it? If so, does the new technology being introduced relate to whatever is replacing the old app? If it does, great, go ahead. If not, why are you introducing this technology? Old apps aren't playgrounds -- they're usually serving a critical purpose in an organization (they're often the first apps that were built at a company and consequently are critical to the company's financial health).
Second, what is the learning curve of the new technology? Are your people ready for it? For example, let's say you've got an old application written in COBOL/IMS/CICS. It's rather unlikely that you'll be able to transition the people working on that to, say, HTML5/CSS3/JavaScript/JSON/REST/NoSQL in a short period of time. So, what's your workforce strategy? Do you have one? It can't be "they'll learn it as they go". That's irresponsible and likely to result in staff churn.
Third, do you know where this new technology sits on the Diffusion of Innovations curve? Probably the WORST thing you can do to a legacy application is graft on a "new" technology that's already at the late-majority or laggard stage of adoption. Why? Because two years from now it'll have TWO old, dying technologies under its umbrella, so you've actually made it worse. Try to restrict your technology introductions to things at the early-adopter or early-majority stage.
Fourth, how many people are HONESTLY contributing to the health and welfare of the new technology? If you go to GitHub, for example, how frequently are commits being made, and how substantial are those commits? Again, don't burden a legacy app with a dying technology just because it's "new to you".
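If you want to make that commit-activity check a little more objective than scrolling the repo page, here's a minimal sketch using GitHub's public commit-activity stats endpoint. It assumes the `requests` library is installed, that the project lives in a public repo, and that unauthenticated rate limits are enough for a one-off look; "some-org/some-library" is a placeholder for whatever you're evaluating, not a real project.

```python
# Rough health check: compare recent weekly commit volume against a year ago.
# Assumptions: public repo, "requests" installed, unauthenticated API access is fine.
import requests

def weekly_commit_totals(owner: str, repo: str) -> list[int]:
    """Return ~52 weekly commit totals from GitHub's commit_activity stats."""
    url = f"https://api.github.com/repos/{owner}/{repo}/stats/commit_activity"
    resp = requests.get(url, headers={"Accept": "application/vnd.github+json"}, timeout=30)
    # GitHub returns 202 while it is still computing stats; a real script would retry.
    if resp.status_code == 202:
        raise RuntimeError("Stats are still being generated -- retry in a few seconds")
    resp.raise_for_status()
    return [week["total"] for week in resp.json()]

if __name__ == "__main__":
    # Placeholder repo -- substitute the technology you're considering adopting.
    totals = weekly_commit_totals("some-org", "some-library")
    recent, year_ago = sum(totals[-13:]), sum(totals[:13])
    print(f"Commits in the last quarter: {recent}; same quarter a year ago: {year_ago}")
```

It won't tell you how substantial the commits are (you still have to read a sample of them), but a steep drop-off in volume over the year is usually a warning sign.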
There are probably many other things to consider, but those are some that come to mind based on my years as an architect making those kinds of decisions.