54

I'm told that software is everywhere and therefore used in other domains. My question is: if you're a software engineer working on software for lawyers or software for biologists, when do you actually get the time to learn about the other domain you're impacting?

How can you make software for lawyers if you're not familiar with the jargon?

UPDATE: I see comparisons made with journalists. I think journalism is not a good example. Often the journalist writes on a topic he/she does not understand, and it comes off as superficial (sometimes even wrong). Software is much more complex.

Doc Brown
  • 199,015
  • 33
  • 367
  • 565
WindBreeze
  • 687
  • 5
  • 6
  • 15
    You don't need to be a biologist to write software for biologists. You just need to be able to read the requirements specification. You may need to know a bit about biology to write that specification, of course, but that is what a business analyst is for. – John Wu Jun 03 '20 at 19:12
  • 2
    So the business analyst is the one who takes the time to understand the domain knowledge? – WindBreeze Jun 03 '20 at 19:12
  • 46
    If you have one. More often than not, it is the developer who is gathering the requirements. – Robert Harvey Jun 03 '20 at 19:14
  • 1
So he needs to understand the domain to write the specification, which means that you are expected to take the time to understand the domain? – WindBreeze Jun 03 '20 at 19:19
  • 29
You don't need *perfect knowledge* of the domain. For example, you don't have to know what Miranda Rights are to put a checkbox on a form that says "This person was administered his Miranda Rights." Though it certainly helps to have a working knowledge of the domain, complete knowledge is neither desired nor required. You are a software developer with some legal knowledge, not a lawyer with some software development knowledge, and that's how it should be. – Robert Harvey Jun 03 '20 at 19:34
  • 3
    @WindBreeze, the average person who commissions software is lucky if he gets a software developer who understands the software domain, let alone the business domain! The reality is that many software developers (and analysts) don't have the necessary domain experience - they muddle through - and both poor quality and disasters abound. – Steve Jun 03 '20 at 19:56
  • @RobertHarvey, what you say is enormously dangerous. You don't have to be a lawyer to write out a given list of Miranda rights on a paper form - you can just do what you're told mindlessly. You do however have to be a lawyer to determine what that list says in the first place, to determine the evidential value of the form, and you have to be a police officer to understand the appropriate ways and settings in which computer data can be collected. These may not always be jobs for the code monkey, but they are surely functions that have to be performed by *somebody* in the development process. – Steve Jun 03 '20 at 20:13
  • 16
    @Steve: You're arguing with a straw man; I never made that argument. I wasn't referring to paper forms; I was referring to application forms. It is the lawyer or his clerk who will check the box, not the developer. – Robert Harvey Jun 03 '20 at 21:10
  • @RobertHarvey, I realise it's not the developer who will check the box. The question is who designs that box-checking process in all the necessary respects, and for what purposes. In virtually all but the most deskilled development positions, the developer may have to contribute some if not all of that design. It only takes a modicum of commercial experience, to find software that was clearly designed by someone who had not the slightest idea. And the difference between paper and computer is not so radical. – Steve Jun 03 '20 at 23:21
  • 17
    As much as software developers instinctively want to know *everything* before writing the first line of code, that platonic ideal is neither practical nor possible. – Robert Harvey Jun 03 '20 at 23:25
  • 2
    @RobertHarvey, it certainly doesn't always have to be "everything", but it may be a considerable amount, at least concerning the *administrative* aspects of a lawyer's job. You might not need to know the difference between a writ of novel disseisin and a writ of fieri facias (to use antiquated jargon), but if you're dealing with a lawyer's workflow, you probably do need to know what a writ is, what sorts of records it involves keeping, and how the lawyer interacts with those records (which in the absence of an existing computer system, may be highly unstructured and require *you* to redesign). – Steve Jun 04 '20 at 00:17
  • 4
    @RobertHarvey: _"it is the developer who is gathering the requirements"_ It's always the business analyst who gathers the business requirements. Whether this person also performs the role of software developer is a separate consideration. Even with overlapping roles, the separation of these roles and their concerns should be maintained as it otherwise muddies the water. (Similarly, wouldn't you agree that you need to have separate layers in a project even if the same developer is developing all of those layers? It's the same principle at play) – Flater Jun 04 '20 at 09:38
  • 4
    @Flater: whenever someone talks of a "business analyst", I get a picture in my mind of overpaid people with suit & tie who are missing experience in development and suffering from the Dunning-Kruger effect so they become a communication barrier. This may be my personal problem, of course. I guess where we can agree is that it makes sense to separate "requirements gathering" as an activity from other activities, especially for complex domains. But I don't think Robert has written something different. – Doc Brown Jun 04 '20 at 11:33
  • 4
    Regarding your "update", that's actually a great example. A software engineer creating software about something they don't understand leads to bugs and crap software. That's why journalists have editors and engineers get domain experts to review the function of their software – Mars Jun 04 '20 at 11:38
  • 1
    @DocBrown: If you separate the activities/responsibility, then "the developer" doesn't gather the requirements, the business analyst does (since this activity is literally business analysis). I'm a software developer, but if I'm making my dinner I'm performing the role of a cook, not that of a software developer. More on topic, I similarly separate my duties as an architect from those as a developer or as a coach. Different role, different focus. Just because the same person does more than one activity does not mean that all their activities should be lumped together. – Flater Jun 04 '20 at 11:57
  • 3
    @Flater: I don't deny this, but Robert was speaking about people, not roles. And separating these roles too strictly leads to the idea of an organization of where it is "braindead best practice" to assign the roles to different people. There may be cases where this works well, but I have too often seen this model fail. – Doc Brown Jun 04 '20 at 12:11
  • 11
"Often the journalist writes on a topic he/she does not understand and it comes off as superficial (sometimes even wrong)" That actually does sound exactly like what many software developers do. – kapex Jun 04 '20 at 12:43
  • 8
    A lot of ink has been spilled here already, but the answer to your question is simply that we don't know how. If we did, we'd make better software instead of having to listen to endless complaints at the doctor's office, retail stores, etc about how much their software sucks. Let us know if you figure it out. – Jared Smith Jun 04 '20 at 13:08
  • 2
    @JaredSmith, speak for yourself about not knowing the answer. Most crap software is not the consequence of trying and failing to advance the state of the art, it's the consequence of those with insufficient education and experience frequently being tasked with responsibilities way beyond their competence, and a consequence of an industry that systematically underinvests in such education and experience. – Steve Jun 04 '20 at 14:23
  • 3
    @Steve no, that's *another*, different reason software sucks, in addition to the problem of not being able to easily extract domain knowledge. Being better at software engineering (and I certainly agree about too much being dumped on the unready and underprepared) does not automatically make you better at understanding *completely unrelated domains*. – Jared Smith Jun 04 '20 at 16:41
  • @JaredSmith, I don't make the fundamental distinction that you do between software and domains. The process of writing software *in the round* is really all about either extracting or (more often) creating new domain knowledge. – Steve Jun 04 '20 at 17:51
  • 4
    I'm appalled at software engineers that just say, "read the requirements doc!", and leave it at that. Everyone working on the product should have an understanding of what they're doing in the bigger picture. While huge and complex products will have layers to this, it's still important that everyone know what's going on or you'll end up with garbage software. It is (or should be) the job of the developer to *interpret* the requirements, not implement them blindly or literally. Interpretation requires a common understanding, which is made through perpetual collaboration and learning. – Brad Jun 05 '20 at 01:22
@Brad I agree, but to some extent this is sadly not an option everywhere. Sometimes the client just wants this ASAP and probably won't bother explaining anything specific; heck, even when you ask about things crucial to the development process, they turn into a ghost until a day before the deadline, when they suddenly show up asking about progress while adding weird stuff to their request. So you just have to wing it. This is from personal experience, sadly. – encryptoferia Jun 05 '20 at 01:28
  • 1
    @encryptoferia Yeah. :-( These days I don't take on clients like that. I tell folks up front that I look for a collaborative partnership. I know not everyone has the choice to turn down work. – Brad Jun 05 '20 at 02:11
  • 3
    The role of a software developer is to write down business rules in an executable form. I'm not a fan of business analysts because many of them just seem to write down business rules in non-executable form. I'm not sure why you're so keen to add that layer of indirection @Flater. I'd make an analogy of wrapping an API (the users) in another API that sometimes munges the results (the BA) in order to consume their data. I'd much rather a developer who can speak to and understand customers. – Wes Toleman Jun 05 '20 at 03:11

10 Answers

63

Software is a knowledge-intensive area. And a big part of the software engineer's work is to extract the domain knowledge from the users and domain expert, abstract it, and transform it into implementable data structures and algorithms.

For example, the best introduction I ever got about legal principles and law was not from a lawyer or a law professor (I followed some courses), but from an AI researcher who worked on modelling legal concepts for an expert system (sorry, this was 30 years ago, and rule based expert systems seemed very promising at that time). His explanations were so crystal clear and logical...

So learning about the domain is part of the job and not something that you would do overnight outside the working hours. All you need is an open mind, and fearless questioning. Moreover, your knowledge will develop iteratively and incrementally exactly as the software you write (since the software embodies this knowledge): learning about requirements, enables you to model, design and implement something, to experiment with it, to exchange with users, and improve it again and again.

But caution: you also need to remain modest. Being able to design a flight system doesn't mean you can hope to replace the pilot and fly on your own ("don't try this at home") ;-)

Christophe
  • 74,672
  • 10
  • 115
  • 187
  • *"And a big part of the software engineer's work is to extract the domain knowledge from the users and domain expert"*. I think the OP is asking from the point of view of someone whose employer is precisely not setting aside six months for him to learn the basic elements of legal process and administration. – Steve Jun 03 '20 at 23:35
  • 35
    In my experience, if you are new to an employer, it can often take 3 to 6 months to get up to speed with their systems and business domain anyway. – Robert Harvey Jun 03 '20 at 23:36
  • @RobertHarvey, a relative of mine is a legal secretary (with a law degree) - from what I understand, at the beginning of her career in a typical small practice, it took her more than 6 months full-time just to grasp the full details of the paper filing system with a view to operating it competently (let alone with a view to regularising and computerising it). – Steve Jun 04 '20 at 00:33
  • 4
@Steve I do not mention time here. My claim is not that this learning process is instant. The idea here is only that an SWE does not have to work in shifts (designing by day and learning in the middle of the night), nor go back to university to get an extra degree in a new field (although that's not necessarily a bad thing either), but that he'll learn on the job. Then he'll benefit from his own domain to accelerate some kinds of knowledge: classification, filing, sorting, pattern matching or process flows are things that no professional SWE has to learn from scratch ;-) – Christophe Jun 04 '20 at 06:30
  • 1
@RobertHarvey exactly: the onboarding in a new team is not only about domain knowledge: a newcomer has to learn the systems in place and how they relate. Reading a domain model is not sufficient to instantly understand what took years to analyse. But software is still eating the world. SWEs, like the ancient Romans, add new tools/knowledge to their mental package after each delivery when they see it's effective. And this leads to innovative cross-fertilisation: applying in one industry techniques discovered in another. I may be an idealist but I really love this job :-) – Christophe Jun 04 '20 at 06:37
  • 7
    @RobertHarvey: I can also attest that simply _changing team_ within the same company requires a similar ramp-up period to get acquainted with the new domain (and codebase). – Matthieu M. Jun 04 '20 at 08:30
  • 1
    @Steve It's a good point, but then the argument is that *someone* on the team will have to be handling that competence, and they'll have to manage the knowledge transfer to the other employees. – deworde Jun 05 '20 at 10:55
24

Same applies to journalists. They write stories about many domains. What about graphic artists, too? Any occupation that works with other occupations has the same problem. You need to work with people who understand that domain: a domain expert.

Writers of software do not need to be experts, but they need access to experts. Those experts work with a person responsible for recording how the application should behave, and the problems it should solve. This person goes by many titles, but you will commonly hear them referred to as a Business Analyst.

The business analyst might be a domain expert, but more likely the business analyst knows of one or more domain experts with whom they interact in order to gather requirements.

For example, I play the role of a business analyst on a project (among many other roles, but that's too big for this question). The application I gather requirements for serves the vocational rehabilitation industry (help people get and retain employment).

I am not a vocational rehab expert. I work with a number of people who are experts in this field. They actually do vocational rehab, so they tell me about the problems they have and we come up with software solutions. I organize the work so developers and testers can build the software without becoming vocational rehab experts themselves.


Addendum: As someone who writes software, your domain is software development. I would expect you to be a domain expert on developing software, but not an expert on the domain the software is written for.

UPDATE: I see comparisons made with journalists. I think journalism is not a good example. Often the journalist writes on a topic he/she does not understand, and it comes off as superficial (sometimes even wrong).

The same thing happens when writing software too. When you lack access to a domain expert, the developer writes software about a topic he/she does not understand, and it comes off as superficial (sometimes even wrong).

Greg Burghardt
  • 34,276
  • 8
  • 63
  • 114
  • 2
Thinking about it, VFX artists must have an intuition about how physics works, and people who draw must learn how objects look in the real world. There is often this step in those other professions where they take the time to study. I was wondering if it was the same for software engineering. – WindBreeze Jun 04 '20 at 02:39
  • 2
A journalist doesn't need to go into as much depth as, say, a novelist writing a novel about a specific historical period. Now my question is: is software engineering closer in this aspect to the job of the journalist or the job of the novelist? – WindBreeze Jun 04 '20 at 02:40
  • 2
@WindBreeze: VFX artists need to know physics, but not at the same level of depth as, say, an actual physicist. The VFX artist needs to know math formulas. The physicist creates and proves the formulas, and must know quantum mechanics as well, sprinkled with a little Special Relativity if things start moving too fast (literally). Therein lies the difference between developer and domain expert. – Greg Burghardt Jun 04 '20 at 16:48
  • 1
Journalist, novelist. Same difference. Both consult with experts. A sci-fi novelist will likely consult with astrophysicists, but the novelist isn't required to prove that a supernova happens when the iron core of a star collapses and the outer shell of the star bounces off the core. The novelist just writes "supernova" and "core collapse" without needing to provide proof. – Greg Burghardt Jun 04 '20 at 16:53
14

How can you make software for lawyers if you're not familiar with the jargon?

By first making Bad software for Lawyers

Like any symphony, any sport, any activity at all, you always start by being bad at it - even if you have some skill in a related area.

The trick is to be bad at it, find your mistakes, learn from them, refine yourself, and go again.

Eventually you won't be so bad at it.


So when is this supposed to happen? All the Time

Software development is a process of learning.

Some of that learning happens on the job...

  • Why didn't this file compile? Something about line 234.
  • Hey, Bob what is a FDHG? Oh, is that what it is?
  • Training day, so I'm getting certified for New Stack 2?

Some of that learning happens off the job...

  • Attending a forum like a presentation night or a conference
  • Reading blogs and articles written by others about this or that topic
  • Grabbing a text book, and reading...

You've already figured out that you have a deficit of knowledge, that you already know you will need, in order to make Good software for Lawyers.

The only way you are going to remedy this is by obtaining that knowledge.

  • Some of it will come from reading, talking, and practising.

  • Some of it will come from the school of hard knocks because of the mistakes you have made.

And all of it will have to be obtained by you.


How much you need does depend on the situation.

  • Sometimes it pays to be unfamiliar, and to learn as you collaboratively design.

  • Sometimes it pays to be familiar, and be able to quickly invalidate poor designs.

It might help to find out how much familiarity is expected of you. Perhaps the team needs you to be the unfamiliar one.


How quickly you go from making Bad software to making Good software in a given domain is entirely dependent on your ability to extract knowledge, and how much effort you put into it.

The same goes for any team.

Kain0_0
  • 15,888
  • 16
  • 37
8

Division of labor

A car is a machine whose function derives from chemistry, i.e. the combustion of fuel. But the people who build cars on production lines are neither chemists nor chemical engineers.

Someone else worked out the chemistry behind combustion and how to transfer it into motion, and designed the plans for a machine to harness that power. Those plans were then given to the production line workers, who implement the steps described in the plan without needing to understand the bigger picture of how it all comes together.

A car cannot operate without fuel, yet a car can be built according to specification without any fuel. Specification is the operative word here. For software developers, that's the requirements that are described in the functional analysis. It contains all the information that is necessary to know how to build the application (similar to the steps describing how to build a car).

That being said, it is true that car builders will usually have a higher-than-average understanding of how cars work as they are surrounded by the subject matter on a daily basis, but that doesn't mean that anything above a basic understanding is a necessity for their job.
Similarly, due to contextual business rules, developers will generally acquire some understanding of how the field works, but that's a side effect of working the job, not a required skill for the job.


Curiosity and osmosis

Back to the software engineering example, the same thing is happening here. Let's say you have biologist customers who want an application to track their inventory of DNA samples.

Right off the bat, software developers will generally omit field-specific details (in this case related to biology) to focus on the underlying (more reusable) principle. Most developers would very quickly identify this application as being structurally similar to other applications from completely different fields, e.g. a warehouse inventory system.

This actually proves the point that at the outset, you don't need field-specific information, as a lot of applications are structurally similar even if they are used in different fields. That's pretty much the core essence of what a developer does: finding the abstract, reusable logic/architecture that's not contextually unique.
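
The structural similarity described here can be sketched as a generic inventory model. This is a hypothetical illustration, not code from any real system; the class and method names are invented:

```python
# A minimal sketch: one generic inventory structure serves both a biology
# lab and a warehouse -- only the item type and location names differ.
from dataclasses import dataclass, field
from typing import Generic, TypeVar

T = TypeVar("T")

@dataclass
class Inventory(Generic[T]):
    """Maps storage locations to items; domain-agnostic."""
    items: dict[str, T] = field(default_factory=dict)

    def store(self, location: str, item: T) -> None:
        self.items[location] = item

    def retrieve(self, location: str) -> T:
        return self.items.pop(location)

# The DNA-sample tracker and the pallet tracker reuse the same structure.
samples: Inventory[str] = Inventory()
samples.store("freezer-A1", "sample-042")

pallets: Inventory[str] = Inventory()
pallets.store("bay-7", "pallet-1138")
```

Nothing in the structure above is biology-specific; the field-specific rules only show up later, as constraints layered on top of the generic model.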

However, then we get to the implementation details, and here there may be context-specific exceptions or rules. I'm no biologist, but let's just invent something and say that DNA samples that are more than a week older than another sample cannot be stored adjacent to one another.

Most of the time, the functional analysis will already cover these rules, with pretty much the exact description I used just now: "DNA samples that are more than a week different in age cannot be stored adjacent to one another".
You don't know why that is the case, nor do you need to know. The rule as phrased in the analysis is enough information for you to implement the necessary logic that would prevent biologists (end users) from wrongly storing these kinds of samples adjacent to each other.
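
To make that concrete: the invented rule above translates directly into code with no biology knowledge at all. This is a hypothetical sketch; the function name and the choice to represent sample age by collection date are mine, not part of any real specification:

```python
# Enforce the (invented) rule exactly as the functional analysis phrases it:
# "DNA samples that are more than a week different in age cannot be
#  stored adjacent to one another."
from datetime import date, timedelta

MAX_AGE_GAP = timedelta(weeks=1)

def may_store_adjacent(collected_a: date, collected_b: date) -> bool:
    """Return True if two samples may occupy adjacent slots."""
    # The developer implements the comparison without knowing *why*
    # the one-week threshold exists -- the spec is enough.
    return abs(collected_a - collected_b) <= MAX_AGE_GAP
```

The UI would then simply refuse a placement when this check returns `False`; the biological reasoning behind the threshold never enters the code.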

However, we're still humans who are curious about things we don't understand. That counts double for developers, as they tend to display character traits like seeking out puzzles and looking for answers.
It's very likely that when a developer is asked to implement this business rule, they're going to ask why that is the case. Not because it's necessary knowledge, but as a matter of casual conversation or personal curiosity.

Your question is built on the premise that this field-specific information is necessary, but it is not. It's simply something that you will generally accrue while working in the context of that field, due to random conversations you either overhear or are part of, and possibly some field-specific business logic that reveals how certain parts of a field work.


Imperfect requirements

There's one more thing to consider which I haven't really addressed yet. You cannot reasonably expect a functional analysis to be perfect. There are always going to be some mistakes or gaps in the document.

If we're talking about gaps in the custom business logic, then this is where having field-specific contextual knowledge can cover for those imperfections.

So you could argue that the quality of a functional analysis is inversely correlated with how much field-specific knowledge your developers should have. The better your analysis, the less your developers need to figure out for themselves, and therefore the less real field-specific knowledge they need.

Anecdotally, as a consultant I've been sent to several development teams where they had a lacking development framework (most commonly in the analysis department), and the developers in those teams were often highly aware of the field in question and how the customer operates.

Conversely, when I was sent to a customer who did have a well-rounded analysis/software spec, developers were generally able to focus on development itself and did not require (nor focus on) the field in question as much.

It's my observation that a lacking/bad analysis leads to a tighter coupling between a developer and the field of their end-user, simply to cover for the knowledge gap that the software requirements are supposed to fill.

A good functional analysis separates the developers from the contextual field as best as it can, leading to developers being able to shift more of their attention towards actual development. This cycles back to the division of labor that this answer started off with: car builders (software developers) shouldn't try to be chemical engineers (biologists). It's not what they're good at.

Flater
  • 44,596
  • 8
  • 88
  • 122
  • 4
"You don't know why that is the case, nor do you need to know. The rule as phrased in the analysis is enough information for you to implement the necessary logic that would prevent biologists (end users) from wrongly storing these kinds of samples adjacent to each other." Richard Feynman, in his memoir "Los Alamos from Below", gives a lucid explanation of how such reasoning came close to causing a nuclear disaster at Oak Ridge. What the engineers reading "near" perceived was different from what the physicists meant. And it really couldn't be communicated without the (classified) physics. – John Doty Jun 04 '20 at 12:45
  • @JohnDoty: Had the engineers and the physicists been using ubiquitous language, the issue wouldn't arise. This related back to the "flawed analysis" section where I specifically point out that while contextual knowledge can cover for gaps in the communication between parties (= ubiquitous language), it should be a fall back, not the intended way of doing things. I can make a counterargument that if the engineers knew **nothing** about nuclear physics, they wouldn't have their own language/definitions which conflicted with those of the physicists as they'd solely rely on the instructions. – Flater Jun 04 '20 at 12:56
@JohnDoty: I'm not saying Feynman is wrong, but I am putting a bolded asterisk on the claim that contextual knowledge is "always" an improvement - since wrong assumptions and inferences generally stem from someone making their own (educated) inferences, which in turn stems from having to fill in the gaps that come from an incomplete specification. There's a reason why calling someone an "armchair [job]" is used as a pejorative - Dunning-Kruger leads them to make calls that they're not equipped to make (but they still do it, because they don't know about the things that they don't know about). – Flater Jun 04 '20 at 13:01
  • @JohnDoty: Using my own example, maybe "adjacent" means something else to a DNA biologist (e.g. within 1m) than to any other person (i.e. not in the slot next to it). But if that is the case, this is where the need for ubiquitous language rears its head: the specifications should contain a dictionary with the DNA-biologist-definition of the word "adjacent" before it can be used in the specifications. The Oak Ridge disaster lacked this ubiquitous dictionary, specifically because the engineers had enough of their own contextual understanding to have their own (different) interpretation. – Flater Jun 04 '20 at 13:06
  • 3
    It isn't as simple as ubiquitous language. What was really needed was an understanding of neutrons and fission, so the engineers could *calculate* what "near" meant in a particular situation. Without that understanding, there's no reliable way to define it. – John Doty Jun 04 '20 at 13:17
  • @JohnDoty: If the definition of "near" requires calculation, explaining the calculation is still a ubiquitous definition. It doesn't have to be a static definition for it to be considered ubiquitous language. We're saying the same thing but with different labels. Necessary information should be explicitly expressed in the specifications (or the attachments to the specifications), so that the specifications become an understandable but independent package. Reasonable omissions can be made (e.g. don't explain that 1+1=2) but where you draw that line requires consideration by the analyst. – Flater Jun 04 '20 at 13:27
  • @JohnDoty: There's also a difference since both parties in your example were dealing with nuclear physics - therefore there is an expectation that they both have some knowledge in the field. But a software developer and a biologist have _zero_ shared knowledge base. The developers do not interact with the DNA, the biologists don't touch the source code. Your example seems to better describe how a _technical analyst_ (nuclear physicist) communicates with a developer (nuclear engineer), as there is an expectation of them sharing a common knowledge base (to some unspecified degree) – Flater Jun 04 '20 at 13:29
  • 2
    Except that the engineers at Oak Ridge were *not* dealing with nuclear physics. They had no knowledge of it because it was classified and they were not cleared. They were, as far as they understood, constructing a chemical plant. There was no shared knowledge base to use to prevent a criticality accident, an occurrence the engineers couldn't even conceptualize. – John Doty Jun 04 '20 at 13:44
  • The problem with the car production analogy is that software is not mostly physical assembly - it is the conception and articulation of the design. In effect, as soon as the engineer has done the drawing, the car itself is done. You can divide between the developer and analyst (the latter being freed up from day-to-day coding), but practically speaking the analyst then becomes the key person who has to understand *both* the business domain and software development. (1/2) – Steve Jun 04 '20 at 13:48
  • The reason the division of labour cannot cut through the middle of those two things is because it is the analyst's (if not the developer's) function to create a description of a business process which is sufficiently explicit, regularised, and devoid of human judgment and oversight so as to be computerised, and typically to design and arrange that process in a way that ensures the resulting human-computer interaction is also efficient. (2/2). – Steve Jun 04 '20 at 13:55
  • 2
    Also Flater, I think with the nuclear engineers (in the comments) that you are indulging an old fantasy that a prior general education can be replaced with more and more explicit text or instructions provided on the spot. In reality, to convey everything coherently, you will just end up replicating a physics educational workshop, except you'll be doing it with no teaching experience and you'll be doing it at a time when you're trying to get something specific done in a short time frame. – Steve Jun 04 '20 at 14:09
6

There's a distinction to be made between requirements and design.

Certainly, a software engineer of some kind is usually responsible for translating requirements into a design. However, it is not the SWE who owns the requirements themselves. Those must be defined by a person or team who works in or represents the domain; a business analyst, a product manager, a subject matter expert, or possibly all of them.

That being said, a SWE is often involved, for two reasons: (1) non-SWEs are often unaware of technical constraints or of what is possible, and (2) non-SWEs are in general very bad at defining rigorous requirements to the level of specificity required to build software. So an engineer could participate in the team or act as a reviewer as requirements are drafted.

But a software engineer would rarely be expected to be the single individual who defines what the domain needs, not in isolation. That would actually be a very bad sign, and worthy of pushback from the software team. Requirements must be defined by the people who are in the domain itself.

John Wu
  • 26,032
  • 10
  • 63
  • 84
  • In many businesses (both in my experience and in my knowledge), there is no separate analysis function. The "software team" are those who are expected to perform all functions relating to software production. And asking "people in the domain" (i.e. the users) to define the requirements would be a risible strategy. I spend a significant amount of time interrogating users for information and digesting what they tell me, but it is ultimately my responsibility, not theirs, to build a mental model of what their work involves, and then conceive the requirements in a coherent and explicit fashion. – Steve Jun 03 '20 at 23:46
  • I think maybe we are saying the same thing but with different words, that it is a team effort. You do, after all, admit that you "Interrogate users," because I think you acknowledge that they are the ones that drive the requirements, and that they have more knowledge about the domain than you do. You do not sit in a room and come up with the requirements in isolation. The fact that you are the one doing the typing and making up the diagrams and whatnot is an implementation detail. – John Wu Jun 04 '20 at 00:20
  • I interrogate the users to find out more about their roles, and tap their knowledge. But they are not doing anything as active as "driving the [software] requirements" - they usually have no idea what a computer can or cannot do, and (especially in the absence of prior analysis and/or computerisation) cannot usually give any systematic description of how they execute their jobs. It's not at all unusual for users to say (in good faith) things that are false, and if they have any requirements at all then they are often contradictory (sometimes subtly, sometimes blatantly). (1/2) – Steve Jun 04 '20 at 01:11
  • My point is not to understate the importance of the users, it's that I don't merely take instruction from them as originators in a pipeline. What they say to me is often a response to a question that I (and not they) have identified as pertinent, and their response is not simply taken at face value but is passed through the refractive lens of my skill and experience with myriad inferences and interpolations applied. I'm also often physically observing their actions and environment. Finally, whatever they are doing now precisely, is not usually what they or the computer does after. (2/2) – Steve Jun 04 '20 at 01:23
  • I agree entirely Steve, and that skillset you are using as you deftly ask the right questions is not the result of studying biology or law. It is an engineering skillset. That is my point. – John Wu Jun 04 '20 at 01:35
  • It actually is the result of my having a good deal of background knowledge! Of course there are general analysis skills involved, but actually having some prior knowledge of the domain (both conceptually and practically), and making it your business to learn it to some depth, can avoid significant calamities in analysis or software delivery. Questions arise and constraints are foreseen that just don't occur when you know nothing. I can't emphasise enough my real observations that the more a software team lacks those with prior intimate domain knowledge, the shorter the journey to disaster. – Steve Jun 04 '20 at 02:26
4

This is a problem that will solve itself in any project in an interactive, iterative way.

So you start with zero knowledge about the domain you get to make software for. Your client will be aware of this and will be eager to explain his problem to you, because he wants it solved. And he will have a crude idea of how he wants it solved. So he tells you what he expects. Then you tell him what else you need to know and what else is possible that may help him even better. Then you make something and show it. Then he goes "almost right, but that's not how we work; when we fill in that form we do not know X yet, that only comes in at the next stage of the workflow". And so on. It is not like "we want you to make this, see you in a year, now go". That would not work, and no one works that way these days.

And then there may be off-the-shelf software from companies that do know the domain because they have a history with it. But I understand your question is more about the first situation.

Martin Maat
  • 18,218
  • 3
  • 30
  • 57
  • And then 2-3 years later you look at what you've written and realize it would have been so much better if you'd known this domain knowledge at first (plus you have improved at software development as well!), so you -scrap it and start over- modernize it. Repeat every few years, as your knowledge grows. – user3067860 Jun 04 '20 at 21:37
3

A lot has been researched and said about this topic.

One thing is clear: we shouldn't expect software developers to become experts in a domain so that domain experts are no longer needed. The domain experts are still needed, and the question is how their domain knowledge is transferred to developers in a way that can be turned into usable software.

One way is the role of the business analyst and detailed specification. In this mode, a dedicated (group of) people frequently meet with the domain experts and other stakeholders and try to elicit requirements from them. Then they write those requirements down as specification documents describing what the software should do. They try to write the specification documents in such a way that even developers with minimal domain knowledge can implement them.

Another way is to have domain experts as part of the team. This way, the domain expert can always provide her knowledge to the developers whenever it is needed. And developers would slowly accumulate the domain knowledge themselves over time through osmosis. Also, the domain expert herself can point out possible improvements to the software that someone without domain knowledge wouldn't notice.

A third possible way is to have fast feedback from domain experts and stakeholders who are not on the team. Here, developers strive to release working software on a short and reliable cadence (like once a week or even more frequently) and put effort into gathering feedback from actual users and stakeholders. This way, even if the developers don't have great domain knowledge, they can still build software that is highly usable to the domain experts and stakeholders. It also allows developers to learn through osmosis about the domain and what is important in it.

My personal opinion is that business analysts and written specifications rarely work. The approach is slow, inaccurate, and has minimal feedback loops built into it. But because it is easy and obvious, it is how "traditional" software was made. The other two approaches are much more "agile", and I believe they are much more efficient. Getting domain experts onto a team and being able to provide a working release frequently is not easy or cheap, but it pays for itself and then some.

Euphoric
  • 36,735
  • 6
  • 78
  • 110
  • 2
    Here is a fourth model: find a person for the team who can be business analyst and senior programmer in one. That person can transfer the knowledge from the domain experts to the team by writing only minimal specs or code, or by teaching the rest of the team, without filling books of useless essays about the requirements. – Doc Brown Jun 04 '20 at 06:23
@DocBrown While that does seem different, I feel that it has a tendency to converge to the first case of having a dedicated business analyst writing specs, as the person won't have time to deal with everything and will try to "save time" by writing things down instead of being interrupted and asked questions. – Euphoric Jun 04 '20 at 09:03
  • 1
There is a huge difference: that fourth model works extremely well. It provides quick feedback, and you have someone inside the dev team who can answer 80% of the domain questions directly and speaks the developers' "language". The drawback is that it is hard to find people who are qualified for this job, and that person can easily become a bottleneck. – Doc Brown Jun 04 '20 at 10:24
  • @DocBrown, I agree. Having some on a software team who are of an analytical nature and who have worked in a relevant department or role can be extremely effective. It wasn't that rare in decades gone by, particularly when firms often provided computer training for existing staff - in my view, the reason it is now rare is because it clearly celebrates a reliance on knowledge embodied in experienced staff, rather than embedded in documents and artefacts which the firm owns and controls. – Steve Jun 04 '20 at 13:23
1

You get it as you go along, the same way you get all the software knowledge you need. (You have to write lots of short functions, a small minority of which do something specific the domain needs; you don't have to write or even read a textbook on the science of it. The sooner you leverage that, the sooner it won't feel quite as daunting.) How many times have you googled how to code an algorithm, or which library function to use, or what an error message means? That's fine, you just absorb it into your working knowledge. The same thing happens with knowledge from other domains. A few examples from my experience will illustrate this, even if they're not very transferable:

  • "The code needs to return the correlation between two variables, of which zero, one or two are categorical. How do you even define correlation when one or both variables are categorical? In particular, what kind of "correlation" do psychologists want?" (Googles, reads Wikipedia, looks for library functions because someone must have done this before - ah yes, they have.)
  • "I'll have to try a few machine learning models and choose the best one. An easy defence of the choice of models is that they're all the ones I could find; scikit seems to have a lot. And how do you rank them anyway?" (Google, Wikipedia, library functions.)
  • "I need to link these data together into a graph. That'll take plenty of planning. I've never even heard of these datasets before. How are their variables related? Also, how do you convert that to graphs? I'll have to call Cypher & SQL from Python." (GWLF.)

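To make the first bullet concrete: the googling described there typically surfaces Cramér's V as one standard measure of association between two categorical variables (whether that is the "correlation" the psychologists actually want is precisely the domain question). Here's a minimal sketch, assuming plain sequences of labels and using only NumPy; a real project would more likely reach for a library function, as the bullet says.

```python
import numpy as np

def cramers_v(x, y):
    """Cramér's V for two categorical variables: 0 = independent, 1 = perfectly associated."""
    cats_x = sorted(set(x))
    cats_y = sorted(set(y))
    # Contingency table counting how often each pair of labels co-occurs.
    table = np.zeros((len(cats_x), len(cats_y)))
    for a, b in zip(x, y):
        table[cats_x.index(a), cats_y.index(b)] += 1
    n = table.sum()
    # Pearson's chi-squared statistic against the independence model.
    expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / n
    chi2 = ((table - expected) ** 2 / expected).sum()
    # Normalize by the maximum attainable chi-squared for this table shape.
    return float(np.sqrt(chi2 / (n * (min(table.shape) - 1))))

print(cramers_v(["a", "a", "b", "b"], ["x", "x", "y", "y"]))  # → 1.0
```

The point stands, though: writing (or finding) this is a small, self-contained problem, and the sliver of domain knowledge it needed came from a quick search, not a statistics degree.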
That's all you have to keep doing. Solve lots of small problems, one at a time. They each only require a tiny piece of domain knowledge. Before you know it, you'll have a lot of that.

J.G.
  • 325
  • 1
  • 7
1

Domain experts who are not engineers usually can’t think in terms of engineering requirements.

Domain experts who can think in terms of engineering requirements but are not software engineers will often produce requirements that are incomprehensible from a software point of view. Worse, such requirements may superficially seem comprehensible.

One of the best programmers I ever worked with was an old professor who was a terrible software engineer. How can that be? Well, he wrote short, simple programs that embodied a clean idea of what he wanted as outputs, what inputs he expected to have available, and what the connection was. His codes were spaghetti (but that isn’t so bad if the code implements a clear vision). He kludged around numerical instability. His interfaces were inconsistent. He didn’t test adequately. Sometimes his algorithms were poor (and sometimes they were brilliant). But all his deficiencies didn’t really matter, because his programs served as clear definitions of what was needed. The deficiencies were easily repairable. It was the vision his code represented that was critical.

So, here’s my advice, which many software engineers don’t like. Don’t code from scratch. Get your domain experts to write prototype code. Some will turn out to be good at this, at least in terms of getting the wanted outputs from the expected inputs. Take the result and give it the full software engineering treatment.

John Doty
  • 119
  • 2
0

In the case of biology, there is actually the field of bioinformatics, which can be studied at many universities. Many universities also offer business informatics. So at least in some cases you can get formally schooled in the domain along with your schooling in informatics.

thieupepijn
  • 323
  • 2
  • 5