11

I had an argument yesterday with one of my colleagues. He (a business analyst, previously a programmer) thinks that he should be aware of the technology used to implement the system so he can make better design decisions. In my opinion (I am a programmer), analysis should not be coupled in any way to the technology, and I believe that a good analyst can produce a great design without worrying about implementation details.

Am I right to think that way? Are there any reasons why a business analyst would need to know the technology used to implement the system?

EDIT: I believe I used the wrong term when I said business analyst; I am not used to these terms. I meant something more like an architect or system analyst.

Thank you everyone for your awesome answers! I am not very experienced yet, and I am glad you opened my eyes to this.

marco-fiset
  • Did you ask him for an example of when it would make a difference? – Karl Bielefeldt Apr 20 '12 at 18:03
  • He did not give me an example, but we are using a wide range of technologies, from old AS/400 systems through some Delphi to .NET for anything new. But I still think that if you design something that is to be implemented in RPG, you will design it the same way as in C#: with separation of concerns, a proper layer for business logic, etc. – marco-fiset Apr 20 '12 at 18:11
  • He would need to know as much as the user does. AS/400 vs. a web app is a detail he would need to know. – coder Apr 20 '12 at 18:35
  • @JPatrick: Can you explain to me why it would be different? You get data from a database, so you have a data access layer. Then you have some business rules that have their own layer. Maybe you have some custom services, and then you have to present something to the user. Where are the differences at such a high level? (See the layering sketch after these comments.) – marco-fiset Apr 20 '12 at 19:31
  • The difference would be in the "then you have to present something to the user" part. The business analyst would need to know what type of user interface is being used and what options are available. There are tools out there that business analysts can use which essentially let them define what should functionally happen when a user does x (such as clicking a button). If the platform doesn't have a button (green screen), then that is a useful piece of information. – coder Apr 20 '12 at 20:08
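To make the layering in marco-fiset's comment concrete, here is a minimal sketch in C# (the shop's stack for new work, per the comments above); all type and member names are hypothetical:

```csharp
// A minimal, technology-agnostic layering sketch. All names are
// hypothetical; the point is that the first two layers look the same
// whether the implementation is RPG, Delphi, or .NET.

// Data access layer: hides where the data lives (DB2 on the AS/400,
// SQL Server, a web service...).
public interface ICustomerRepository
{
    Customer FindById(int id);
    void Save(Customer customer);
}

// Domain object shared by the layers.
public class Customer
{
    public int Id { get; set; }
    public decimal OutstandingBalance { get; set; }
    public decimal CreditLimit { get; set; }
}

// Business logic layer: the rules, with no UI or storage details.
public class CreditLimitService
{
    private readonly ICustomerRepository _customers;

    public CreditLimitService(ICustomerRepository customers)
    {
        _customers = customers;
    }

    public bool CanExtendCredit(int customerId, decimal amount)
    {
        var customer = _customers.FindById(customerId);
        return customer.OutstandingBalance + amount <= customer.CreditLimit;
    }
}

// Presentation layer: deliberately absent. A WinForms grid, an ASP.NET
// page, and a 5250 green screen would all call CanExtendCredit, but
// what each can offer the user differs, and that is where platform
// knowledge matters.
```

The data access and business layers translate across stacks largely unchanged; the presentation layer is where the platform starts to constrain what the analyst can ask for, which is coder's point.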

9 Answers

18

There are certainly cases where it makes sense for a business analyst to understand the technology, at least well enough to know when to question a business user about how important a particular feature really is. For example, if the business is accustomed to the behavior of a fat client application while the new application is going to be web-based, there will likely be many "requirements" that would be trivial in a fat client but relatively difficult in a web-based application. If the business analyst understands whether a request from the business is going to be trivial for the development team or whether it is going to involve 20 hours of AJAX development, they can decide whether to just write down the requirement or to engage the business in exploring alternatives.

For any given project, there are likely many different sets of requirements that would, in reality, satisfy the business, each making different trade-offs. The more the business analyst understands about the trade-offs they are making, the more likely they are to deliver a set of requirements that maximizes the benefit to the business while minimizing the cost.
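As a hedged illustration of that fat-client-vs-web gap (hypothetical names; WinForms stands in for any fat client): filtering a list as the user types is a single in-process event handler over in-memory data in a fat client, while the web equivalent of the era meant server endpoints and AJAX round trips.

```csharp
using System;
using System.Linq;
using System.Windows.Forms;

// Hypothetical fat-client form: "filter the list as the user types"
// is one event handler, no network involved.
public class OrderSearchForm : Form
{
    private readonly TextBox _filter = new TextBox { Dock = DockStyle.Top };
    private readonly ListBox _results = new ListBox { Dock = DockStyle.Fill };
    private readonly string[] _orders = { "A-100", "A-101", "B-200" };

    public OrderSearchForm()
    {
        _filter.TextChanged += (sender, e) =>
        {
            _results.Items.Clear();
            _results.Items.AddRange(
                _orders.Where(o => o.Contains(_filter.Text))
                       .Cast<object>()
                       .ToArray());
        };

        Controls.Add(_results);
        Controls.Add(_filter);
    }

    [STAThread]
    private static void Main()
    {
        Application.Run(new OrderSearchForm());
    }
}

// The web version of the same "requirement" (circa 2012) means an AJAX
// request per keystroke, a server endpoint, and latency and error
// handling: the 20 hours the answer mentions, instead of 20 minutes.
```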

Justin Cave
  • +1 for the "maximizes the benefit to the business while minimizing the cost". That cannot be done without the BA understanding the technology. The BA's job is to understand more technology than the programmer, at a higher level. – mattnz Apr 20 '12 at 20:47
  • As an addition: it's not the requirements that should be changed, but the constraints affecting the implementation of those requirements. Just because the business cannot get what they want does not mean they should stop wanting it, though it does force them to rationalise what they can have *now*. E.g., having a bad job now doesn't stop me wanting a better one; it just can't be achieved under the current constraints. The important thing is that it opens the opportunity that, if a constraint is removed, the requirement can then be satisfied. If you change the requirement for now, you lose it forever. – BiGXERO Jan 13 '14 at 11:18
8

Having worked both sides of this issue, I have to agree with the analyst. I have seen some spectacularly poor designs result from a lack of understanding of the capabilities of the technology. In some cases, it was the result of taking marketing hype as truth. In general, the problem has been generating specifications which don't match the technical capabilities.

The analyst should be specifying What needs to be done, When, and by Whom. They should know Why it is being done. Development priority should depend more on the Why than on the other factors. The design and development team need to handle the How. In order to develop cost-effective systems, the analyst needs to specify what needs to be done in terms that don't push the boundaries of the available technology.

Pushing the boundaries can increase costs in a number of ways, but in some cases may have a significant return. Some of the cost factors are:

  • Experimentation may be required to develop a working solution;
  • New employees or consultants with specialized knowledge may need to be acquired;
  • Training on the new technology may be needed;
  • Development tends to be slower and bug rates higher; and
  • Extra efforts may delay simpler solutions which have more immediate value.
BillThor
6

If the technology that will be used is known, analysts should take it into consideration when creating the design. Different technologies do things differently, and a design that doesn't take those differences into account is going to have problems.

However, business analysts shouldn't care about what technology is used; their job is to gather business rules and make them understandable to the technical team. Systems analysts, architects, designers, or whatever name they are given should know the technologies being used and design around them, because they should be the ones doing the actual design, not business analysts.

Ryathal
6

I believe there is a point between the two lines of thought that is probably more realistic. While a high-level design might be best kept technology-agnostic, known real-world constraints and requirements must still be incorporated into the design. What level is this design? Do you have sufficient requirements? How flexible is the environment? Is management invested in a specific technical direction?

Are there no operational parameters that drive you in a specific direction? Do you have a broad array of resources capable of implementing a solution in any technology stack? Are there interoperability issues requiring access to other systems?

Answers to these questions are needed before you can definitively say whether the technology should be a part of the equation or whether the design should drive the technology selection.

Given no constraints and a very high-level design, I might agree with your thinking that the design should be truly agnostic. However, in my 20+ years of experience, I've rarely been in a situation without constraints that limited my choices and drove my design toward specific technologies or technology families.

3

The ideal user interface would be where the user thinks a thought, and what they wanted done is simply done. Anything short of that is crippled by the technology limitations we have at our disposal, so of course the BA needs to understand what context they can design the system in.

gahooa
2

Different technologies can have very different cost and efficiency structures for solving a given problem. These costs can include things such as hiring costs in the local area, energy and cooling costs for specific systems, opportunities to reuse existing code and equipment, etc. So, yes, one can perhaps ignore these constraints and the details of specific technologies on a project where cost and efficiency are nowhere near as important as other considerations (such as in aviation safety, nuclear plant control, medical implants, etc.). But in most business situations, management will care about the cost structure of the potential solutions versus the benefits of the system implementation.

hotpaw2
1

The business analyst should know what kind of application we are developing (*web application / console application / mobile application / reporting application, etc.*) so that she can come up with a better set of features for the application, or push back on the user over impossible expectations (e.g., third-level nested drag and drop).

He/she does not need to be aware of which specific technology (Java/C#/Python/SQL, etc.) is used.

java_mouse
1

The analysis process itself needs to be entirely technology-agnostic. When you are researching the client and its needs, you need to do so with a completely open mind. The other side of the coin, however, is that the analyst is often asked to provide recommendations and may be required to handle system architecture as well. This is an entirely different facet of the role, in which a wider understanding of the available technologies is crucial: it can make a huge difference to the customer, not only in getting a project off the ground, but also in meeting the customer's long-term needs and sustaining the project itself.

While it's true that the larger part of designing software is essentially the same regardless of the technology used, there are always areas where the design will be influenced by the choice of technology. Platform choices may influence language and API choices, while availability of expertise, support, and even cost will also have an impact. So from one perspective your position is justified: the actual analysis should be conducted without the influence of any specific technology. Using the analysis to produce a design, however, will always require broader technology knowledge, so that the analyst can make recommendations that allow the designs to meet the customer's needs.

S.Robins
0

Each technology has limits and constraints, so it makes sense for an analyst to consider those limits. On the other hand, an analyst who knows .NET well but hasn't seen Java since the late nineties will most likely design a .NET solution (using .NET terminology and design patterns) even if Java (or RoR, etc.) would fit the problem better. It is relatively difficult to implement such a design in another technology later.

Therefore, I think an analyst should be agnostic when the technology hasn't been selected yet, but should design with the chosen technology in mind in those cases where the choice has already been made.

user281377
  • Aren't design patterns language-agnostic? – marco-fiset Apr 20 '12 at 18:18
  • They are usually not tied to a specific language, but some technology stacks may make them easier to implement than others. A design made with ASP.NET MVC in mind might be cumbersome to implement in plain PHP or Oracle Application Express (see the sketch below). – user281377 Apr 20 '12 at 18:43
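As a rough illustration of that last comment (the model and repository types are hypothetical; Controller, ActionResult, HttpPost, ModelState, View, and RedirectToAction are real ASP.NET MVC API): a design written with ASP.NET MVC in mind quietly assumes routing, model binding, and validation come from the framework.

```csharp
using System.Web.Mvc; // ASP.NET MVC, the stack the comment assumes

// Hypothetical model and repository; the framework pieces are real.
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public interface IProductRepository
{
    void Save(Product product);
}

public class ProductController : Controller
{
    private readonly IProductRepository _repository;

    public ProductController(IProductRepository repository)
    {
        _repository = repository;
    }

    // Routing sent the request here, model binding parsed the form
    // post into a Product, and validation populated ModelState,
    // all supplied by the framework.
    [HttpPost]
    public ActionResult Create(Product product)
    {
        if (!ModelState.IsValid)
            return View(product);      // re-render the form with errors

        _repository.Save(product);
        return RedirectToAction("Index");
    }
}
```

Port that same design to plain PHP and each of those framework services becomes code you write and maintain yourself, which is exactly the "cumbersome" part of the comment.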