
Possible Duplicate:
How to manage a Closed Source High-Risk Project?

I'm working at an institution that has a really strong sense of "possession": every line of software we write should be ours alone. Ironically, I'm the only programmer at the moment, but we're planning on hiring others.

Since my bosses don't consider the new programmers people they can trust, they have an issue with copies of the source code. We use Git, so anyone who clones a repository gets an entire copy of each project they work on.

We can restrict each developer to a single key with Gitolite and bind that key to their PC, but they could copy the key to another computer and gain repository access from there. Also (and most obviously), they could simply upload the files somewhere else, add another remote, or copy them to a USB drive.
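To illustrate, the kind of Gitolite rule I mean looks roughly like this (the repo and key names here are made up):

```
# conf/gitolite.conf -- sketch; repo and key names are hypothetical
repo payroll-app
    RW+ = dev1-workstation   # matches keydir/dev1-workstation.pub

# The key itself is still just a file, so nothing stops the developer
# from copying it to another machine.
```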

Is there any (perhaps clever) way to prevent events like these?

EDIT: I would like to thank everyone for their insights on this question. It has not only been eye-opening, but also gives firm support to my arguments against my bosses in the near future (since you basically think like me, and I've been trying to make them understand that).

I am in a difficult situation work-wise, with my coworkers and bosses being like two gangs (and me basically in the middle), so all this input is greatly, greatly appreciated.

It is true that I was looking for a technical solution to a people problem. Both the management and the employees are the problem, so it can't be solved that way (I was thinking about some code obfuscation, perhaps working with separate modules, etc., but that wouldn't work from my developer POV). The main problem is the culture inside and outside the company: development is not taken seriously in my country (Venezuela), so naivety and paranoia are in fact real issues here.

The real answer here is an NDA (something that doesn't completely work here in Venezuela), because that's the people solution: no sane developer would work under those conditions. Things will get ugly, but I think I'll be able to handle that thanks to your help. Thank you all a lot! <3

AeroCross
  • 117
    It sounds like you are asking for a technical solution to a people problem. If we take this a step further, how would you prevent the new programmers from committing the lines of code they saw/wrote to memory and copying them down later? –  Oct 17 '12 at 16:32
  • 152
    Wow... remind me to never apply for a job there. – Walter Oct 17 '12 at 16:35
  • 35
    If your boss doesn't see the difference between intellectual property and real property, then he is likely to be as competent in business as he is technically. Is he pointy-haired? – deadalnix Oct 17 '12 at 16:50
  • 34
    There is an easy technical solution to this problem. Don't write any code. No code, no leaks, no problems (other than the problems that would have been solved by the code). – emory Oct 17 '12 at 17:08
  • 4
    Just to help clarify this issue both for you and for people responding, whether or not you can trust people won't steal the code only really matters for the (paranoid?) worry that a new person would steal all your code in the first week and then leave. Beyond that, trusting that the code they are writing is any good is the bigger issue. As in, if you are trusting their code enough to rely on it for your business, then it seems small potatoes to trust them with a copy of the codebase. – jhocking Oct 17 '12 at 18:33
  • 2
    @jhocking - the truly paranoid would hire multiple developers (preferably unaware of each other) to do the same work. A voting system would collate their inputs. You would not have to trust any developer, just trust that a majority of them are not making the same screwup. Not that I think it is a good idea. – emory Oct 17 '12 at 18:47
  • 35
    I've worked for people like this. Their companies never grow past a certain point because they are unable to give up control and ultimately fail in some area. I don't think I'd consider this a long-term career although being the first programmer in a company can certainly give you some great experience (Does he mind you taking THAT with you when you leave?) – Bill K Oct 17 '12 at 19:20
  • 7
    As others have already pointed out, you need to consider which risk is bigger: having the code leaked or hiring people who are not passionate. Passionate people don't work where they don't feel trusted, sometimes like to tackle problems from home, and hate having obstacles put in their way by clueless managers. Some of them also prefer their laptops. Obviously, there are some counterexamples where it's strategically important to at least divide codebase between employees, like at Apple or Microsoft. Are you creating iOS, or Windows, or what? – Dan Oct 17 '12 at 19:32
  • 13
    If you're competent and intent on stealing code you will succeed. – MrFox Oct 17 '12 at 20:07
  • 5
    You'd have to prevent camera phones, printers, and everything. Heck, we could easily hide our code in an innocent-looking image and sneak that out. I've worked at a place that locked down USB ports, blocked tons of sites, monitored email, and generally restricted access heavily. And there are still dozens of ways to get code out. Ironically, it also made it hard for me to bring something in if I already had the solution at home! – CaffGeek Oct 17 '12 at 20:30
  • 1
    @emory ISTR reading that sort of setup being tried with 3 parallel teams as an error prevention setup (assuming in most cases only 1 of 3 would get it wrong); but that it was a miserable failure in practice. – Dan Is Fiddling By Firelight Oct 17 '12 at 21:11
  • 4
    @CaffGeek not to mention you'd have to wipe the memories from their brains at the end of each work day. The boss is always complaining about devs not remembering what they did yesterday. – emory Oct 17 '12 at 21:15
  • 6
    Unless you work for a government institution where secrecy actually matters, this is a recipe for disaster for the company. If you're so worried about your stuff being stolen, you should go do one of those silly patent and trademark and copyright nightmares that everyone seems to be so into right now. – Linuxios Oct 17 '12 at 21:34
  • 3
    Good luck hiring ... I wouldn't work for a company that didn't trust me with the source code, and I know no good developer who would. – Duncan Bayne Oct 18 '12 at 01:57
  • 3
    Lock all the computers, remove DVD drives, fill all the USB ports with putty, no printers, no internet connection. But what a miserable place to work! – sgud Oct 17 '12 at 18:13
  • 5
    I disagree with most people here. Here's a **real life** situation from a big company I worked for: New programmer is hired. New programmer is very enthusiastic. New programmer wants to work over the weekend! New programmer copies the entire code base to USB so he can work from home. This is a MAJOR security breach, for every company! Luckily, IT security did their job: there were tools in place to **detect leaking of sensitive data**. USBs work, emails work, but you cannot send all types of data. This is not paranoia, but common sense. – Kobi Oct 18 '12 at 07:15
  • Many of these answers (not the comments) are not distinguishing between source code and data. There are established protocols for dealing with sensitive data; one of the answers mentioned it, about Protected Health Information. Code leaks, which were the specific question, are different and, from my experience, a more difficult matter. Regardless, suggestions such as what @Kobi stated are just good sense, and have nothing to do with paranoia. – Ellie Kesselman Oct 18 '12 at 07:43
  • 4
    **Comments are not suitable for extended discussions.** Please refrain from commenting if you don't have something valuable to add to the question, if you just want to talk about it we have [chat](http://chat.stackexchange.com/rooms/21/the-whiteboard), comments are only meant for clarifications. – yannis Oct 18 '12 at 14:36
  • The very point of a development team is to work together to solve the problem that the program needs to solve. I may know how to do this, but I'm getting a compile error when trying to build my new code and have no idea what's causing it. I shoot an email to a coworker; he comes over when he has the time (after he's finished with the block he's on, say 5-10 minutes). I work on something else for a few minutes, he comes over, we look it over, and bam, he sees I misspelled something. That's only a few minutes versus banging your head for hours. It just simply cuts the idea of a team out of the equation. – Randy E Oct 18 '12 at 14:40
  • I think it's more important to ask how to **reduce** unnecessary code leakage than it is to ask how to *prevent* code leakage. It'll happen either way, whether it's on purpose or accidentally. – zzzzBov Oct 18 '12 at 19:20
  • @Kobi Emails work? Okay, then there's a very good chance a zipped tarball of code in an email will also work... especially if you change the extension. – Izkata Oct 19 '12 at 11:02
  • @Izkata - Indeed, too clever. My point is not that the protection cannot be bypassed by easy means. Not all leaks are caused by malice. For example, RMS (Rights Management Services) can encrypt all documents, stop you from sending them in mails or copying data, and even block screenshots. This sends a strong message to people: this data is important. So yes, you can use your camera to take a picture of the screen, but it achieves its goal nonetheless. It is naive of most answers here to assume you should do nothing, trust people, and everything will be OK. – Kobi Oct 19 '12 at 12:01

14 Answers

137

This is one of the situations where you are looking for a technical solution to a social problem.

A social problem should require a social solution, which, in this case, takes two complementary forms and an additional organizational solution which may help:

  • Trust. If you don't trust developers, don't hire them. Working with people you don't trust is synonymous with failure. Relations based on mistrust require a lot of formalism, which may severely impact not only the productivity of your employees, but also the number of people willing to work with you. Chances are, the best developers will avoid your company at all costs.

  • NDA. Trusting someone doesn't mean you shouldn't take legal precautions. Those precautions can take the form of a contract or an NDA clause with severe consequences for the employee in case of disclosure.

    How severe the consequences are depends on who you are. Government organizations, terrorists, or the mafia can permit some deterrent ones. Ordinary companies may be limited, by law, to financial ones only.

  • Slicing. Trust and contracts are a good start, but we can do better. If the sensitive part of the code base can be sliced so that two or more parts are required for the product to function, make sure that the developer from department 1 never sees the source code developed in department 2, and vice versa.

    People from one department shouldn't be able to meet people from other departments, and ideally, they shouldn't even be able to guess what the other departments are doing, nor how many departments there are. Each person knows only a small part, which is not enough to see the entire picture (and reconstruct the entire product outside the organization).

Those were social and organizational measures.

Now, technically speaking, there is nothing you can do.

You may try to:

  • Force the developers to work in a closed room on a machine which is not connected to the internet and doesn't have USB ports.

  • Install cameras which monitor everything which happens in the room, with several security officers constantly observing the developers working.

  • Strip-search every developer each time he leaves the room to be sure he doesn't have any electronic device that could hold the code.

  • Require every developer to have an ankle monitor. The device will listen to what they say, record their position and attempt to detect any electronic device nearby. If the developer was near a device which is not identified and doesn't have your tracking software installed on it, private investigators and hackers may attempt to check whether the developer wasn't using the device to leak information.

  • Forbid developers to leave your buildings, unless being under heavy surveillance, and to interact in any way with the outside world.

Some or all of those measures are illegal in many countries (unless you represent certain government agencies), but the worst part is that even with all those measures in place, developers will be able to get the code out, for example by discreetly writing it on their skin or on a piece of paper and hiding it in their clothes, or simply memorizing it if they have an eidetic memory.

Or they can just globally memorize the data structures and the algorithms—that is the only important thing where intellectual property matters—and create their own product inspired by those two things.

Arseni Mourzenko
  • 41
    -1 - Of course there's something you can do. Do you think armored car companies trust all of their employees? Do banks have to trust all their developers? They all have various security measures in place to prevent both tangible theft, and IP theft isn't that much different. How do you think defense contractors handle it? This is a misleading and provably wrong answer. – Scott Whitlock Oct 17 '12 at 17:29
  • 64
    @ScottWhitlock I'm sorry but how do you stop people from memorizing a block of code they're writing and then going home, rewriting it from memory and selling it? Though this is a ridiculous fear, it is 100% possible and 100% unstoppable if an employee so chooses to do it. This is precisely why MainMa is spot on that you must *trust* your developers. (As well as having contracts and a legal team to enforce trust violations) – Jimmy Hoffa Oct 17 '12 at 17:47
  • 25
    @JimmyHoffa - if you're talking about a code base of any significant size, then the amount you can memorize is insignificant compared to what's at stake. That's like saying that since people can steal my garden gnomes, what's the point of locking my car? People tend to take the easy route, and that's making a digital copy. – Scott Whitlock Oct 17 '12 at 17:49
  • 8
    @ScottWhitlock I'm not speaking about stealing the entire code base, there are of course plenty of security tactics available to make that difficult, I'm just saying that regardless of security measures; you need to trust your developers, because you can't *completely* stop them from acting against your interests. – Jimmy Hoffa Oct 17 '12 at 17:54
  • 2
    @JimmyHoffa - I'm sure we're on the same page then. I'm suggesting a layered approach - technical barriers to bulk copying, and NDA + good background checks, social engineering, training, threatening, etc., to deter smaller scale stuff. – Scott Whitlock Oct 17 '12 at 17:58
  • 2
    @ScottWhitlock yes, money is very different, so your banking comparison doesn't make any sense. Money does not duplicate the way code does. Obviously you can take security measures, but a level of trust is always required. – deadalnix Oct 17 '12 at 19:02
  • 3
    I agree with @ScottWhitlock - I had training at several places I wrote code (Apple, NASA, FBI) that included common-sense ethics stuff I learned in my BS degree, as well as signing NDAs, reminding folks what the penalties were for violating copyright, etc. At NASA nobody on my team could access the IT resources until they had followed the training on MTCR, which included videos. The machines were locked down with keyboard spies and reminders (every time you logged on) that everything you typed was recorded. – Fuhrmanator Oct 17 '12 at 19:42
  • 25
    @ScottWhitlock, you don't need to take the entire code base, just the important proprietary bit. Any decent dev can recreate what they've done a second time if they need to. The value is in the novel approach to solving the problem. That's what developers do: solve problems. Code is merely the recording of the solution. And the real IP, the solution, the intangible thought that solved the problem, is impossible to protect. If I'm working on the next big thing at Apple, they can't wipe my brain of knowing what it is... and if I know what it is, I can recreate it. – CaffGeek Oct 17 '12 at 20:36
  • 1
    +0 While I _want_ to upvote this answer, there are ways to help the OP accomplish at least a little part of his goal. See some other answers for examples. – Phil Oct 17 '12 at 22:32
  • 1
    @ScottWhitlock you've obviously never heard of Samuel Slater. – electron_avalanche Oct 17 '12 at 23:13
  • 1
    It is really the best answer. The main problem of IT projects is "how to make this work", not "how to prevent this from leaking". I have worked at many companies (more than 5), and after leaving I deleted all the source, because code is only part of the business, not the whole business. So if it's necessary to prevent code from leaking, how do you prevent the team, the mathematical ideas, and other uncontrollable things from leaking? The answer is simple: >> Technically, there is nothing you can do. – pinocchio964 Oct 18 '12 at 12:17
  • 2
    Because it's possible to pick a lock, should you never lock a door? Just because you can't make something theoretically impossible doesn't mean you can't create obstacles that deter people. Defense contractors DO make it difficult to take their code home. – MetricSystem Oct 18 '12 at 16:01
  • @CaffGeek, That depends on the nature of the business you are working on. For most, there's nothing revolutionary about the code. Sure, you *can* re-create the code, but there's no way to do that as fast (nor as cheaply) as a thumbdrive transfer can. – Pacerier Oct 04 '15 at 20:35
70
  1. Make them sign a non-disclosure agreement.

  2. Only hire people you trust.

  3. Compartmentalize your code base. Use dependency injection so you can give them requirements such that, when finished, the resulting classes fall right into place in the existing architecture, but they never have access to the "complete picture", only loose pieces. Only senior, trusted people would have clearance for the "architectural glue" that makes it all work as a complete whole.
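A minimal sketch of what point 3 can look like in practice, assuming Python (all class and method names here are invented for illustration):

```python
from abc import ABC, abstractmethod

# The only artifact a new hire sees: an interface plus a requirements spec.
class TaxCalculator(ABC):
    @abstractmethod
    def tax(self, amount: float) -> float:
        ...

# Existing architecture: it depends on the interface, never on a concrete
# class, so a finished implementation "falls right into place".
class Invoice:
    def __init__(self, calculator: TaxCalculator):
        self._calculator = calculator

    def total(self, amount: float) -> float:
        return amount + self._calculator.tax(amount)

# The new hire's deliverable, written without ever seeing Invoice.
class FlatTax(TaxCalculator):
    def tax(self, amount: float) -> float:
        return amount * 0.16

# The "architectural glue", wired up only by senior, trusted people.
invoice = Invoice(FlatTax())
```

The contractor's module imports only the interface, so the wiring code is the one place where the whole picture is visible.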

Tulains Córdova
  • 18
    I wouldn't recommend the last one, because it would prevent any dev from doing smart stuff. Any competent dev would run away from such a situation. The conclusion is not hard to deduce. – deadalnix Oct 17 '12 at 16:48
  • 20
    @deadalnix: #3, if done properly, makes code much easier to maintain; it forcefully avoids coupling. I'd say this approach is more common on very large projects. That being said, I don't think the OP's company is at the scale needed to justify it. – Brian Oct 17 '12 at 17:05
  • 15
    I have nothing against separating a project into different parts. I have something against hiding the larger picture. This is making your devs blind on purpose. Do you think ANY dev who can choose his/her work will accept playing this stupid game? – deadalnix Oct 17 '12 at 17:33
  • @deadalnix Maybe not all developers would have to work that way, only those who don't have enough seniority or haven't earned enough clearance. – Tulains Córdova Oct 17 '12 at 17:57
  • 13
    @deadalnix given enough money, yeah. I worked for a DoD contractor once. Closed environment, two separate machines (one with internet access for research, one without for coding), and it was obvious that what we were coding wasn't the full product. Everything you could complain about in a software project was there. But boy did they pay! It was soul-crushing, but I looked at my bank account every week and continued on. – Michael Brown Oct 17 '12 at 18:28
  • I have to disagree with #3: the exorbitant architectural cost of compartmentalizing your codebase to protect against what is a people problem is not worth it. In fact, I would say that even splitting up your codebase based on VCS permissions is too expensive to justify. Look at the additional cost in unnecessary merging, obfuscation from devs who are just trying to be productive, and infrastructure to support this notion. It's why OSS will always win. – deleted_user Oct 17 '12 at 20:33
  • 9
    @stackmonster An example: the developer of an Eclipse plugin doesn't have to have access to the Eclipse code, only to the plugin interface definition. I suspect that in a company like Apple, for instance, very few developers have access to the whole code base of iOS. – Tulains Córdova Oct 17 '12 at 20:37
  • @user1598390: if you needed access to eclipse's source code to write plugins, it would be an appalling plugin architecture. The same with applications on iOS. That isn't solving the same problem, that's solving the problem of it being completely unworkable to develop any other way on systems that large with requirements like that. – Phoshi Oct 17 '12 at 21:36
  • @user1598390 I'm pretty sure every Apple dev has access to several BSD distros and the Mach kernel. – deadalnix Oct 17 '12 at 22:30
  • It's a legal issue indeed, let them sign an agreement. – Carra Oct 18 '12 at 07:11
  • The third option is also an easy way to introduce bugs into other sections of the code by not knowing how it's all going to work together. A dev needs access to the entire code base to be effective in almost any situation. – Randy E Oct 18 '12 at 14:19
  • @randy-e Do you think that applies also to humongous projects like OS X, Linux, etc.? Does the developer of functionality "A" need to study and know the other hundred million lines of code, or does he/she only have to know the specifications of his/her part of the project? – Tulains Córdova Oct 18 '12 at 14:51
  • "...in almost any situation." The vast majority of programmers aren't working on projects like an entire operating system...common sense should apply to the term almost any situation. – Randy E Oct 18 '12 at 14:53
  • @randy-e I guess in very small projects. – Tulains Córdova Oct 18 '12 at 14:54
  • Even on large projects, developers should still have access to the entire source, because my code could be interacting differently than expected with another person's code, but I wouldn't know that and wouldn't be able to know the best way to fix it without being able to see their code. – Randy E Oct 18 '12 at 14:58
  • 1
    @RandyE That's what object-oriented design is for. You design interfaces and abstract classes other people have to implement or extend in order to work with the rest. You only have to comply with a contract, i.e. method signatures, interfaces, etc. It's the same way automakers work with part-makers: they issue engineering specifications. The part-maker doesn't need to know which exact car model will be using a certain spark plug or clutch. They only have to follow the custom specifications and quality standards. The automaker doesn't need to leak model designs to part-makers. – Tulains Córdova Oct 18 '12 at 15:05
  • let us [continue this discussion in chat](http://chat.stackexchange.com/rooms/6160/discussion-between-user1598390-and-randy-e) – Tulains Córdova Oct 18 '12 at 15:29
  • @MikeBrown, Cool =) Btw are they paying *way* above Google rate? – Pacerier Oct 04 '15 at 20:39
45

I love the idea that there might be some "clever" trick that "we" as developers would be baffled by, given that every developer tool ever written was written by a developer, and all that.

Your boss's biggest problem is naivety with a dash of paranoia. I'm being polite there. Really really polite.

If you really want a shopping list of things to keep your code proprietary, just implement the following:

  1. Disable USB and other I/O ports on all company computers. This can be done through most enterprise antivirus suites or similar.

  2. All developer machines to be desktops or towers. No laptops.

  3. Do not allow any machine to connect to the internet. No web, FTP, email, or IM. No internet at all; cut the wires.

  4. No remote working/access (sort of covered by no internet, but some smart spark might suggest a VPN)

  5. No mobile phones or other electronic devices to be taken into the secure "development" room.

  6. Configure all printers to print a large visible watermark on every page, front and back.

  7. Bag searches both in and out. Search for handwritten notes and anything printed on company printers (anything: they might have hidden code in an image with steganography!), plus any electrical or electronic devices. In fact, it's probably best to ensure bags are kept out of the secure area entirely and that developers wear clean suits, the sort of thing you'd see in drug dens and chip fab plants.

  8. Servers should be equally isolated; backups should be encrypted, and only your boss should know the password to restore from them.

Brendan Long
Ian
  • 5
    And even then, good old memory works great. And without cavity searches... – CaffGeek Oct 17 '12 at 20:44
  • 11
    Hah, you know I started adding a 9) developers might remember something, but couldn't think of a way to fix that besides filling the developers with vodka :) – Ian Oct 17 '12 at 20:51
  • but I'm smarter frunk, or at least I deel that way. ;) – CaffGeek Oct 17 '12 at 21:17
  • 25
    9. No outside phone calls allowed from landlines etc. (they could tell someone the code over the phone). 10. Developers must work in a windowless room so they can't shine a torch out the window and transmit the code via Morse code. 11. Developers must be [neuralized](http://en.wikipedia.org/wiki/Neuralyzer) after completing each day of work. – zuallauz Oct 17 '12 at 21:19
  • This is the kind of ridiculous security that only top secret government data needs. Although I think it's reasonable to have a government programmer in the NSA doing that, I can't imagine any civilian company needing to go that far. +1. – Linuxios Oct 17 '12 at 21:37
  • 3
    For number 9, you can use Jason Bourne style behavior modification training. Every day, before the employee goes home. – hrishikeshp19 Oct 18 '12 at 01:39
  • 5
    continuing zuallauz's list, #12 developers go to work nude to show they don't tattoo code. #13 developers do not brown-bag lunch lest they arrange their potato chips in code instead of eating them. – zundarz Oct 18 '12 at 04:31
  • you forgot the brain wipe at 5 pm every day. – Johnno Nolan Oct 18 '12 at 10:17
  • 8
    why send them home? Keep them in the building, food and drink passed in through an airlock system only. – jwenting Oct 18 '12 at 11:28
  • 17
    14. Don't ever run software written by the developers. It can contain malicious code that steals confidential data and sends it outside the company. – Adam Dyga Oct 18 '12 at 14:16
  • I'm baffled by the "clever" asymmetric encryption algorithms used on the web, and I am unable to bypass them despite them being just developer tools written by "us" developers. Care to enlighten me? – Allon Guralnek Oct 18 '12 at 18:35
  • Allon: I don't really understand how the software that grants root on my Android phone works, but a developer did, and I used the software they wrote to bypass the security on my phone. As for your example, knowing how encryption works (probably) won't allow you to bypass it. That wasn't the point. – Ian Oct 18 '12 at 20:32
  • 4
    I heard a story told of a great thief. He went to major trade shows, carrying nothing. On leaving the first day of one particular trade show, he told a guard that he was indeed a great thief, having stolen untold billions. The guard searched him, but found nothing. Again and again the thief returned telling tales of the great wealth he had stolen, and again and again the guard found nothing on him. Perplexed, the guard asked the thief how he carried out his great thefts. He replied, "I steal ideas." -- You do not need to copy code. You need only the ideas contained within. – greyfade Oct 20 '12 at 09:07
35

If the people in question can't be trusted to keep to their employment contracts, then he needs to not hire them.

If he believes NO ONE can be trusted, then he's being overly paranoid, and he's ultimately going to damage the company if he keeps it up.

At some point you MUST trust your employees. It's not really an option to do otherwise. If you don't trust your employees at all, then they can't possibly be effective, because you're spending too much time distrusting them, and wasting a lot of their time jumping through hoops due to trust issues.

Also, when you make it clear to people that you don't trust them, they tend to get annoyed. And annoyed programmers eventually find another job where they aren't treated that way.

The right solution is a basic background check, a well-thought out employment agreement, and some trust.

Michael Kohne
  • 9
    "If he believes NO ONE can be trusted, then he's being overly paranoid, and he's ultimately going to damage the company if he keeps it up." - VERY TRUE. – GrandmasterB Oct 17 '12 at 21:07
22

I used to write code for classified computer systems. They had all sorts of ridiculous hoops to jump through to keep it secret. For example, we weren't allowed to bring music CDs into certain rooms because they could be CD-RWs in disguise.

The thing is, practicalities of the work open up security holes out of necessity. Sometimes you had to transport non-classified data/code into or out of a classified area in order to get the job done. Yes, there were rules and procedures for that, too, but they all eventually boiled down to trusting people. Instead of trusting people not to put classified data on a convenient usb thumb drive, you're just trusting the exact same people to jump through all the security hoops.

In other words, there are a lot of things you can do to protect from outsiders, but once you let them in, it's pretty much just a question of how much you want to annoy them.

Karl Bielefeldt
  • 4
    Some of those security hoops are for other reasons. For instance, USB thumb drives often carry malicious files, so following the procedure to transfer files off a classified system prevents the system itself from being compromised; individual files are protected by other security rules. – Ramhound Oct 17 '12 at 19:09
  • and to prevent files from being taken out, you can customize your operating system to automatically encrypt everything it writes to any destination and decrypt it again when reading. Anything not encrypted yields only read errors; anything encrypted is useless outside the company. Worked for one company that did that (albeit in hardware; they also used superglue to glue shut the serial, parallel, etc. ports on the computers). – jwenting Oct 18 '12 at 11:32
  • "For example, we weren't allowed to bring music CDs into certain rooms because they could be CD-RWs in disguise." -- Thanks Brad! – Gareth Davidson Oct 18 '12 at 13:37
16

In short, you need to have a non-disclosure agreement/contract with the employees you are hiring. In addition to this signed agreement, hire developers you can trust.

Technically speaking, the code can easily be copied to a device and reused somewhere else. What your boss doesn't want is your competitors getting access to this code. You can only enforce such policies through contracts of employment or partnership.

Another thing that may work is providing training on the sensitivity of information and on how each employee should be alert to violations of confidentiality. In addition, tracking computer activity within your office network will also raise the warning level.

Similar training is done annually for employees who deal with PHI (Protected Health Information).

However, getting trusted people on board and training them to protect this information might be the optimal way to go.

Yusubov
  • 2
    +1 for mentioning **training**, a crucial point that other answers seem to be overlooking! Developers are intelligent people; showing why the security procedures are necessary is a far better approach than trying to make the procedures watertight. – Peter LeFanu Lumsdaine Oct 18 '12 at 16:20
13

Did you see Paycheck with Ben Affleck? I think that's the only way to guarantee your IP won't be "stolen". I can tell you that because of my memory, I could recreate almost every system I've worked on for any significant amount of time. If not a line-by-line recreation, I could produce the significant elements, and probably improve on them in the process because of how my skills have grown over the years.

Plain and simple, there are only two things you can accomplish by "locking down" your code:

  1. You'll repulse good developers who are offended by your obvious lack of trust
  2. You'll raise a great big flag that tempts the more unscrupulous developers

If you think your software is that novel and amazing, get a patent.

Michael Brown
  • 21,684
  • 3
  • 46
  • 83
  • 4
    If you think it's hard to prevent one of your developers from copying your IP and selling it to a Chinese company, consider how hard it would be to sue that Chinese company for patent infringement. It could be that preventing your IP from being copied is actually much less expensive than the alternative. – Scott Whitlock Oct 17 '12 at 19:27
  • 1
    If you think your software is that novel and amazing, chances are, someone else has already written it. And while you're filing for a patent, they're making the software better. – greyfade Oct 20 '12 at 09:10
11
  1. You can't prevent code from leaking; you can only limit the leak to less important code.
  2. Normally there is only one part of the application that makes it unique, typically some algorithm. You can hide it behind a well-defined interface and keep it in a separate source-control branch that only you and the people who need to work on it can access. You provide the other developers with just an (obfuscated) binary and the interface.
  3. Divide your code into more modules and let people work only on a particular module, with the other modules provided as (obfuscated) binaries. This can, however, be stretched so far that it becomes unmanageable, and it can lead to code duplication.
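Points 2 and 3 can be sketched in code. A minimal illustration (all names here are hypothetical): most developers program only against a published interface, while the sensitive implementation lives in the restricted branch and is handed out as a compiled or obfuscated artifact.

```python
from abc import ABC, abstractmethod

# The interface is all most developers ever see; the real implementation
# ships to them only as a compiled/obfuscated module.
class PricingEngine(ABC):
    """Public contract for the proprietary algorithm (illustrative)."""

    @abstractmethod
    def quote(self, amount: float) -> float:
        ...

# Lives only in the restricted branch: the sensitive implementation.
class SecretPricingEngine(PricingEngine):
    def quote(self, amount: float) -> float:
        return round(amount * 1.17, 2)  # placeholder for the real algorithm

# Everyone else codes against the interface, never the implementation:
def build_invoice(engine: PricingEngine, amount: float) -> dict:
    return {"amount": amount, "total": engine.quote(amount)}
```

The drawback noted above applies: every module boundary you add for secrecy rather than for design reasons is one more interface to maintain.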
jokerdino
  • 105
  • 1
  • 1
  • 4
tofi9
  • 211
  • 2
  • 7
9

I know someone who worked in an environment like this one.

A few measures that were taken there:

  1. No access to the physical computer, all the computers were stored in a locked room with holes in the wall for display, keyboard and mouse.

  2. No internet access.

  3. Customized operating system that used an in-house-built file system with encryption (in case a computer were to get stolen somehow).

  4. Projects were divided in static libraries, no one had access to the whole source code (this of course does not apply for all programming languages).

  5. It was a very unpleasant workplace.

Brendan Long
  • 806
  • 8
  • 12
yms
  • 188
  • 1
  • 8
  • 2
    And I can still think of ways to get data... I'm sure there's a recorder device I could tie into the monitor's data wire. – CaffGeek Oct 17 '12 at 20:47
  • 3
    Or I could get a fake eye with a tiny camera and recorder in it! – CaffGeek Oct 17 '12 at 21:03
  • @CaffGeek: Or simply use the camera in your phone. I'm sure it wouldn't take someone a lot of effort to implement an OCR converter to turn the pictures back into a text file. And I know my phone will automatically upload all pictures I take, so I could take them, wait for them to upload, then delete them from my camera and then peruse them at my leisure when I get home. – TMN Oct 17 '12 at 21:08
  • 2
    You can block cell signals from escaping the room. And you wouldn't be allowing the cells into the room...hooray for friskings! – CaffGeek Oct 17 '12 at 21:14
  • 8
    I know of a workplace that sounds a lot like this. They're called the NSA. In addition to doing things like this to prevent source code from being stolen by interested outsiders, they also subject their new hires to very detailed background checks to be sure they can trust them. – Ken Bloom Oct 18 '12 at 02:54
  • 2
    @KenBloom Hi! My name's Bradley Manning... – itsbruce Oct 18 '12 at 09:37
    @itsbruce I don't believe all that. As someone who has had a security clearance (Top Secret) while I was in the Army, I know for a fact that you only get access to the information required to do your job. An Army Private (or even a Specialist before he was demoted) would not have access to State Department cables unless they were passed off for intel analysis by the Army...by a Private...which isn't going to happen. – Randy E Oct 18 '12 at 14:28
9

Be careful not to overestimate the value of your source code to those outside your company.

Sure - you paid a lot of engineers (Developers, QA, etc) lots of money to develop it, but that doesn't mean it's intrinsically valuable to a third party.

What exact attacks exist?

Source code is occasionally leaked from (e.g.) games-development or IT-security companies. Such leaks make them look bad, but otherwise cause no real harm.

Ask yourself:

  • Are you so embarrassed about the quality of your source code, it would be damaging to your reputation to leak it?
  • Does the source code itself contain confidential secrets (e.g. hard-coded encryption keys, details of financial relationships with other companies)? Should it?
  • Could a competitor genuinely benefit from using your source code, despite the risk of being discovered?

As far as I know, there was a case where a member of staff at one IT security company tried to sell the source code to a competitor. The competitor immediately reported this to his employer, and he was soon dismissed. No money changed hands, but the staff member can't work easily in the IT security industry any more.

MarkR
  • 201
  • 2
  • 4
  • 2
    +1 for reality checks. There were a number of big source code leaks in the past (e.g. [Half Life 2](http://en.wikipedia.org/wiki/Half-Life_2#Leak) ), with apparently relatively little damage to the company. As a matter of fact, e.g. MVPs and governments can even [officially obtain the source code for Windows](http://www.microsoft.com/en-us/sharedsource/mvp-source-licensing-program.aspx). – sleske Oct 18 '12 at 10:27
  • @sleske yes, but under very strict conditions of confidentiality. There's that trust thing again... – jwenting Oct 18 '12 at 11:36
5

You have to trust, but sometimes it's better to just monitor for infractions. For instance, your code could phone home to some IP address on the internet whenever it's run, logging the IP address, computer information, and perhaps even the geolocation of where it's running. Review that log to look for problems.
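A minimal sketch of such a check-in, assuming a purely hypothetical licensing endpoint (the URL and every field name below are made up for illustration):

```python
import json
import platform
import socket
import urllib.request
from datetime import datetime, timezone

PHONE_HOME_URL = "https://licensing.example.com/checkin"  # hypothetical endpoint

def build_checkin_payload() -> dict:
    """Collect the machine details the monitoring server would log."""
    return {
        "hostname": socket.gethostname(),
        "os": platform.system(),
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

def phone_home(payload: dict) -> None:
    """POST the payload on every application start."""
    req = urllib.request.Request(
        PHONE_HOME_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
```

Geolocation would require an extra lookup service, and of course anyone reading the source can simply remove the call.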

Of course there are ways around that, and developers could just be looking at the code, not executing it.

You could configure your firewall to do packet inspection with something like Snort, which could a) block the connection immediately if it detects, e.g., your copyright notice, and b) report it to management for follow-up. To see inside SSL/HTTPS traffic you'd need to run an HTTPS proxy on your firewall.

Perhaps you could have a system utility on every PC that checks external drives and USB flash drives for specific files or key phrases while they're plugged in. You could also disable or ban external drives entirely (I've heard the US DoD does this). You'd have to require that developers use your hardware, and not bring in any hardware from home.
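Such a utility could be as simple as a script that walks a newly mounted drive looking for marker phrases; the phrases below are purely illustrative:

```python
import os

# Illustrative markers: strings that only appear in your proprietary code.
MARKER_PHRASES = [b"Copyright (c) Example Corp", b"PROPRIETARY"]

def scan_for_markers(mount_point: str, markers=MARKER_PHRASES) -> list:
    """Walk a removable drive and flag files containing any marker phrase."""
    hits = []
    for root, _dirs, files in os.walk(mount_point):
        for name in files:
            path = os.path.join(root, name)
            try:
                with open(path, "rb") as fh:
                    data = fh.read(1_000_000)  # inspect only the first 1 MB
            except OSError:
                continue  # unreadable file; skip it
            if any(marker in data for marker in markers):
                hits.append(path)
    return hits
```

A real deployment would hook this into the OS's drive-mount notifications and report hits centrally; it is still trivially defeated by compression or encryption, which is the general weakness of content scanning.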

There are probably programs available that will log everything someone does with a computer. Just schedule random audits of these logs, and make sure developers are aware of this capability.

Once you've found an infraction, hit the developer over the head with the NDA they signed.

Scott Whitlock
  • 21,874
  • 5
  • 60
  • 88
  • Your first paragraph suggests that you are going to host the application. And how are you going to monitor the developper who is going to write that "feature"? – Simon Bergot Oct 17 '12 at 17:24
  • 1
    @Simon - presumably you'd get the "trusted" one to write it, or you'd have two different developers write two different features in isolation. :) Don't think the downvote is fair. I wasn't suggesting these were foolproof ideas, just ways to make the security better. – Scott Whitlock Oct 17 '12 at 17:26
  • Security theater always seems better. – BryanH Oct 18 '12 at 18:23
5

A technical solution could be to have all development sessions hosted on a server with no (or severely restricted and monitored) network access beyond that required to serve the RDP (or VNC) sessions. That way, the source is never on a developer's machine. It would also make it possible to work from home or a client site.

Ultimately, though, I think the whole situation is ripe for failure. If you make it obvious to developers that you don't trust them, and make their jobs harder, then IMHO you're creating an environment where developers will find some way to make the source public, just to "stick it to the man".

TMN
  • 11,313
  • 1
  • 21
  • 31
2

You are working for people who simply do not understand how software engineering works. Worse: they don't value it (only what they can get out of it). It is not going to be possible to work productively for them; ultimately, they will punish you for this. Find another job.

itsbruce
  • 3,195
  • 17
  • 22
-4

While I do agree that it's a recipe for disaster, because no programmer will stay there long enough and you will spend more time trying to understand other people's code, some things come to mind:

  • Block internet access
  • Block any third party access (USB, card readers, etc)
  • Encrypt and store elsewhere any finished code. That is, turn every completed module into an API so programmers can no longer access its internals.
  • This also means separating programmers from debuggers.

Anyway... prepare for high turnover of people, and for high salaries if you want them to stay. You are creating a very bad working environment.

Keep in mind you could basically write much of this code from the free examples out there on the Internet. Most programming is actually re-use. You are only going to block and delay your workers.

  • 16
    Saying that you don't read other answers is just a magnet for downvotes. – vsz Oct 18 '12 at 08:03
  • 2
    Sometimes it's hard to understand this community. I really doubt all the people here read everything; I was just being honest, since like most people here I am quite busy and just wanted to add something. Yet someone who only says "I will never apply for a job there" and brings nothing to the post is given 128 upvotes. – sebastianf182 Oct 18 '12 at 20:26
  • Upvotes on comments are worth far less than upvotes on answers. – Keith Thompson Oct 18 '12 at 22:37
    I know. I don't care about the points. I am just saying: that's completely off topic, yet people punish the one who actually gives an answer... – sebastianf182 Oct 19 '12 at 01:50