
I have 3 (Python) git repositories that I need to work with. One of them is the top-level application (packageA), and the other two (packageB and packageC) are dependencies of packageA. (Note: none of the 3 are available on PyPI.)

I've got a virtualenv set up, and have each of my 3 packages checked out within it.

Option A) Use pip --find-links to install packageB and packageC within the env, and add them to packageA's requirements.txt. This works great in that it tracks packageA's dependencies as part of my git repo (I will check in requirements.txt). However, the problem here is that (as I understand it) any changes I make to packageB/packageC won't be reflected until I package and re-install them.
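As a concrete sketch of Option A (the directory layout here is an assumption: the three checkouts are siblings, and each dependency builds an sdist into its own dist/ directory):

```
# build distributable archives for the two dependencies (hypothetical paths)
(cd ../packageB && python setup.py sdist)
(cd ../packageC && python setup.py sdist)

# requirements.txt of packageA lists them by name:
#   packageB
#   packageC

# install, telling pip where to find the local archives
pip install --find-links=../packageB/dist \
            --find-links=../packageC/dist \
            -r requirements.txt
```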

Option B) Manually add .pth files (with paths pointing to packageB and packageC) directly to the site-packages folder. However, then I cannot add packageB and packageC to my requirements.txt.
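For reference, a .pth file is just a text file in site-packages whose lines (existing directories) get appended to sys.path; a minimal sketch of Option B, with a hypothetical file name and paths:

```
# $VIRTUAL_ENV/lib/pythonX.Y/site-packages/local-deps.pth
/home/me/src/packageB
/home/me/src/packageC
```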

Is there a better way to manage these dependencies?

Also, how would I "package" all 3 together to deploy them elsewhere? Say, AWS EC2? A custom build script? Or, is there a better way to do that as well?

COLblindOR

1 Answer


As pschill suggested, you can install packageB and packageC in editable mode with pip install -e in the virtual environment. Changes in those two packages then take effect immediately. This also works with requirements.txt. (The only caveat is compilation: if either package contains code that needs compiling, such as Cython extensions, you'll need to rebuild that manually after changes.)
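A minimal sketch of this, assuming the same sibling-checkout layout as above (paths are hypothetical):

```
# install both dependencies in editable (development) mode;
# the checkouts themselves are put on the import path
pip install -e ../packageB
pip install -e ../packageC
```

requirements.txt itself can carry the editable flags, so a fresh environment is reproducible with a single pip install -r requirements.txt:

```
# requirements.txt of packageA
-e ../packageB
-e ../packageC
```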

teekarna