I'm beginning a new project at work and will likely be nearly the sole developer on it, though one or two other developers will need to integrate existing applications or simple scripts into the main project. The project needs to handle small-scale bulk and streaming data ingest/processing, as well as both event-driven and on-demand code execution. Some parts of the framework will be heavily CPU bound, and some could be heavily I/O bound; most of the data must live on a single machine, but we can create a cluster and connect VMs to increase available compute power. There will probably be one or more small web applications that depend on services provided by this core framework. The main language will be Python for just about everything.
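To make the workload mix concrete, here's a rough sketch (not actual project code; the function names and numbers are placeholders) of how the CPU-bound and I/O-bound pieces might sit side by side in plain Python, with a process pool for the former and threads for the latter:

```python
from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor
import urllib.request


def crunch(chunk):
    # stand-in for a CPU-bound transformation
    return sum(x * x for x in chunk)


def fetch(url):
    # stand-in for an I/O-bound task (network or disk)
    with urllib.request.urlopen(url) as resp:
        return len(resp.read())


if __name__ == "__main__":
    # processes sidestep the GIL for the CPU-heavy work
    chunks = [range(1_000_000)] * 8
    with ProcessPoolExecutor() as cpu_pool:
        totals = list(cpu_pool.map(crunch, chunks))

    # threads are enough for work that mostly waits on I/O
    urls = ["https://example.com"] * 4
    with ThreadPoolExecutor(max_workers=4) as io_pool:
        sizes = list(io_pool.map(fetch, urls))

    print(totals[:1], sizes[:1])
```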
My question is whether I should take a microservices approach to an effort like this or stick with a monolithic application, given that I'll be doing most of the development by myself. My thought is that microservices (using Nameko) provide a natural separation between elements of the framework that have different execution models (data pipelines, event-driven, on-demand, web applications, etc.) and a clear way to distribute the workload and communication across multiple processes. My concern is that I'd probably end up with a Kubernetes cluster to manage (I'm familiar with Docker, but still fairly new to Kubernetes), multiple supporting services (RabbitMQ, Redis, etc.) required just to run the system, and potentially a lot of small chunks of code to implement all the capabilities we'll need in the framework.
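To illustrate what I mean by separating execution models, here is roughly what one Nameko service might look like; the service and event names are made up for the example:

```python
from nameko.rpc import rpc
from nameko.events import event_handler
from nameko.web.handlers import http


class IngestService:
    name = "ingest_service"

    @rpc
    def process_batch(self, records):
        # on-demand execution: another service or script calls this over RPC
        return {"processed": len(records)}

    @event_handler("pipeline_service", "data_arrived")
    def handle_data_arrived(self, payload):
        # event-driven execution: runs whenever the (hypothetical)
        # pipeline_service dispatches a "data_arrived" event
        pass

    @http("GET", "/health")
    def health(self, request):
        # simple HTTP entrypoint a small web application could hit
        return "ok"
```

Even this minimal sketch needs a broker to run (e.g. `nameko run services --broker amqp://guest:guest@localhost` with RabbitMQ already up), which is exactly the kind of supporting infrastructure I'm worried about accumulating.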
For a project with little more than a single developer, do microservices still simplify developing and maintaining a complicated system like this? Are there methods/systems/frameworks I should consider using instead, or ways to reduce the overhead involved in designing the system this way?