Writing compilers is, in general, a solved problem. We have figured out the basic principles of how to convert a character stream into tokens, tokens into abstract syntax trees, trees into assembly code, and assembly into machine code, and how to design languages that are easy to create compilers for.
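To make the pipeline concrete, here is a minimal sketch for a toy expression language targeting an imaginary stack machine (all names and the instruction set are made up for illustration; a real compiler adds many more stages):

```python
import re

# Toy pipeline: characters -> tokens -> AST -> stack-machine "assembly".
TOKEN_RE = re.compile(r"\s*(?:(\d+)|(.))")

def tokenize(src):
    """Character stream -> token stream."""
    tokens = []
    for num, op in TOKEN_RE.findall(src):
        tokens.append(("NUM", int(num)) if num else ("OP", op))
    return tokens

def parse(tokens):
    """Token stream -> AST, for left-associative '+'/'-' expressions."""
    pos = 0
    def atom():
        nonlocal pos
        kind, value = tokens[pos]
        assert kind == "NUM"
        pos += 1
        return ("num", value)
    node = atom()
    while pos < len(tokens):
        _, op = tokens[pos]
        pos += 1
        node = (op, node, atom())
    return node

def codegen(node):
    """AST -> instructions for a tiny stack machine."""
    if node[0] == "num":
        return [("PUSH", node[1])]
    op, lhs, rhs = node
    return codegen(lhs) + codegen(rhs) + [("ADD" if op == "+" else "SUB",)]

print(codegen(parse(tokenize("1 + 2 - 3"))))
# -> [('PUSH', 1), ('PUSH', 2), ('ADD',), ('PUSH', 3), ('SUB',)]
```

Each function corresponds to one arrow in the pipeline above; the hard, unsolved-in-practice parts (optimization, register allocation) all live after this point.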
The Dragon Book (https://en.wikipedia.org/wiki/Compilers:_Principles,_Techniques,_and_Tools) from 1986 documents this very well, and is highly recommended.
I once read that compilers tend to be simplest when there are zero, one, or infinitely many instances of a given resource, such as registers or stack space. This is also why modern compilers are rather complex (at least at the machine-code generation stage): there are more than one, but far fewer than infinitely many, registers (or cores, in a GPU), and generating optimal code is simply hard, especially because processors with the same instruction set may have different characteristics. The answers to https://stackoverflow.com/q/11227809/53897 demonstrate not only that different CPUs perform differently simply because of their branch-prediction implementations, but also that different compilers can produce very different code: the Intel C++ compiler swapped two loops and avoided the problem asked about.
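The loop in that question can be sketched roughly like this (a Python rendition for illustration; the original is C++, and the timing difference the answers discuss only shows up clearly in compiled code, where the branch itself dominates):

```python
import random

def conditional_sum(data, threshold=128):
    """Sum only the elements >= threshold. The `if` below is the branch
    whose predictability the linked Stack Overflow answers discuss."""
    total = 0
    for value in data:
        if value >= threshold:  # predictable on sorted data, random otherwise
            total += value
    return total

random.seed(0)
data = [random.randrange(256) for _ in range(10_000)]

# Sorting changes nothing about the result, only about how predictable
# the branch is for the CPU -- which is the whole point of the question.
assert conditional_sum(sorted(data)) == conditional_sum(data)
```

The compiler-relevance: a sufficiently clever compiler can restructure or remove the branch entirely, which is exactly what the Intel compiler did.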
That said, what you are asking is whether there have been any advances in the tooling for creating compilers.
What you might be missing is that one of the most desired features of a compiler is that it be self-hosting, that is, written in the language it compiles (because once you are there, your toolchain becomes much simpler). To my knowledge this is where most new languages (for which you need compilers) want to go, and they usually do. By implication, the modern tooling is written in that particular language, and is therefore usually not usable for other languages.
I would suggest you look into the Roslyn compiler for C#, because it is probably the newest really large-scale open-source compiler right now; there you can see what its toolchain looks like today. (I have not looked at it myself.)
So, what if you just want to throw together a small, simple compiler? The traditional way is to use the Unix toolchain (lex and yacc, reimplemented by the GNU project as flex and bison and widely available in Linux distributions), but you explicitly said you didn't want that.
Therefore I would suggest that you consider a modern Lisp dialect. Lisp allows you to specify programs as Lisp structures (S-expressions) which can be read directly into memory (the same reason JSON originally became popular), so you start with the tokenized program instead of having to write a parser, which is a nice head start. Also, Lisp programming might be quite a learning experience in itself, as mentioned by Eric Raymond in http://www.catb.org/~esr/faqs/hacker-howto.html#skills1 . I looked at Scheme a while back and liked it, but I would suggest experimenting a bit to find the dialect you like the most. Look for one with a good debugger.
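To see why this is such a head start, here is a sketch (in Python, for illustration only; in any real Lisp the built-in `read` procedure does this for you) of how little work it takes to turn an S-expression into an in-memory tree:

```python
def read_sexpr(src):
    """Read an S-expression into nested lists. The 'parser' is little
    more than a tokenizer plus bracket matching, because the source
    text already IS the parse tree, written out with parentheses."""
    tokens = src.replace("(", " ( ").replace(")", " ) ").split()

    def read(pos):
        if tokens[pos] == "(":
            node, pos = [], pos + 1
            while tokens[pos] != ")":
                child, pos = read(pos)
                node.append(child)
            return node, pos + 1  # skip the closing ')'
        tok = tokens[pos]
        return (int(tok) if tok.lstrip("-").isdigit() else tok), pos + 1

    tree, _ = read(0)
    return tree

print(read_sexpr("(define (square x) (* x x))"))
# -> ['define', ['square', 'x'], ['*', 'x', 'x']]
```

Compare those twenty-odd lines with the lexer and grammar you would need for a C-like syntax: the rest of your compiler can start from the tree immediately.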