
At my new place of employment we have an intranet solution containing multiple web applications hosted in IIS. Outside of the development department, other departments either source or create applications and reports that deep link into these web applications.

Unfortunately, I have no direct control over these applications, and some get created without our knowledge. This causes a problem whenever we need to do some restructuring or refactoring.

I was wondering if anyone had an approach where all inbound links, regardless of origin, had to be registered - e.g. a link registry or some other way of dealing with this. This could then be used to understand the impact of any reorganisation of the web application's structure.

Any help much appreciated.

lostinwpf

2 Answers


There are at least two technical methods to deal with the incoming links:

  1. Instead of registration, consider logging and analyzing incoming traffic to determine the source of the links. Good log information will allow you to track the incoming links back to their source. In the event of restructuring, look for an increase in the number of 404 errors as a starting point for finding out-of-date links. Logging and analysis tools such as Webalizer and awstats are good places to start.

  2. If you cannot use logging or find it ineffective, you can develop a proxy through which all incoming links must pass. Valid incoming links should include a registration key (which you assign by department or other factor). Any incoming request which does not include the registration key should be redirected to an information page which explains how to obtain a key. You can associate the keys with specific departments or relevant sources in your database, and can use that database to reach out to the external linkers when you change the link targets.
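To illustrate the logging approach in option 1, here is a minimal sketch in Python that scans an IIS W3C extended log for 404 responses and tallies them by referrer, so out-of-date links can be traced back to the application that generated them. The sample log lines, field order, and URLs are hypothetical; a real log declares its field order in a `#Fields:` directive, which you would parse rather than hard-code.

```python
from collections import Counter

# Hypothetical field order; a real IIS W3C log declares this in its
# "#Fields:" directive, which should be parsed rather than assumed.
FIELDS = ["date", "time", "cs-uri-stem", "sc-status", "cs(Referer)"]

SAMPLE_LOG = """\
#Fields: date time cs-uri-stem sc-status cs(Referer)
2014-03-11 08:00:01 /reports/old.aspx 404 http://finance-app/dashboard
2014-03-11 08:00:05 /home/index.aspx 200 -
2014-03-11 08:01:12 /reports/old.aspx 404 http://finance-app/dashboard
"""

def broken_links_by_referrer(log_text):
    """Count 404 hits, grouped by (referrer, requested URL)."""
    counts = Counter()
    for line in log_text.splitlines():
        # Skip directive lines and blanks.
        if line.startswith("#") or not line.strip():
            continue
        record = dict(zip(FIELDS, line.split()))
        if record["sc-status"] == "404":
            counts[(record["cs(Referer)"], record["cs-uri-stem"])] += 1
    return counts

print(broken_links_by_referrer(SAMPLE_LOG))
```

Sorting the resulting counts gives a prioritised list of broken inbound links and, via the referrer, the departments to contact.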

However, consider whether your business model needs to change. If you are not responsible for creating the external links, should it be your responsibility to guard against their failure? Your business may find more value in either:

  1. Requiring the link creators to take responsibility for the links that they generate (i.e. they must reach out to you when something breaks), or

  2. Restricting link creation to IIS staff.

Either of these options will allow you to have a better chance to avoid failure in the future.

George Cummins
  • Thanks, I do currently have LogParser to do some rudimentary logging, so I might improve this as you say. Unfortunately, 404s might be too late, so I will need to catch these in advance. I like the proxy idea but am intrigued as to implementation methods. – lostinwpf Mar 11 '14 at 07:55

There are things you can do, and they all come down to being able to tell a legitimate deep link from an illegitimate one, for whatever definition of legitimate you've decided on.

It could be as simple as having your approved application set a value in the session when the user hits the proper start page; any request without that value set must be a deep link. You could decide that external applications that want to pass this check need to obtain an ID from the server, which then lets you track all the requests made with that ID; otherwise, you don't serve the page.
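A minimal, framework-agnostic sketch of that session check (in Python, with the session object modelled as a plain dict; the start page path, session key, and redirect target are all assumptions for illustration):

```python
START_PAGE = "/home/index.aspx"   # hypothetical entry point
FLAG = "entered_via_start_page"   # hypothetical session key

def handle_request(session, path):
    """Serve a page only if the session was initialised on the start page;
    otherwise treat the request as a deep link and redirect."""
    if path == START_PAGE:
        session[FLAG] = True      # mark the session as legitimately started
        return "serve start page"
    if session.get(FLAG):
        return "serve " + path
    # No flag set: the caller deep linked straight in.
    return "redirect to " + START_PAGE

session = {}
print(handle_request(session, "/reports/summary.aspx"))  # redirected
print(handle_request(session, START_PAGE))               # flag now set
print(handle_request(session, "/reports/summary.aspx"))  # served
```

In a real IIS application the same check would live in a shared module or handler so that every page enforces it consistently.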

Note that this prevents legitimate users from deep linking as well, unless they first go to the start page. It's also pretty easy to hack around, if your business issues run deep enough for that to be an issue.

psr