Background
I have a project that depends on a certain type of hardware device. It doesn't really matter who makes that device, as long as it does what I need it to do. That said, even two devices that are supposed to do the same thing will have differences when they are not made by the same manufacturer. So I am thinking of using an interface to decouple the application from the particular make/model of device, and having the interface cover only the highest-level functionality. Here is what I am thinking my architecture will look like:
- Define an `IDevice` interface in one C# project.
- Have a concrete class, defined in a library in another C# project, that will be used to represent the device.
- Have the concrete device class implement the `IDevice` interface.
- The `IDevice` interface might have methods like `GetMeasurement` or `SetRange`.
- Make the application aware of the concrete class, and pass the concrete instance to the application code that utilizes (not implements) the `IDevice` interface (a sketch follows this list).
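To make this concrete, here is a minimal sketch of the layout I have in mind. The method names `GetMeasurement` and `SetRange` are the ones mentioned above; the project names, the `AcmeDevice` class, and the method signatures are just assumptions for illustration:

```csharp
// Interface library (the project name "Devices.Abstractions" is hypothetical).
public interface IDevice
{
    double GetMeasurement();               // assumed return type
    void SetRange(double min, double max); // assumed signature
}

// Concrete device library (hypothetical "Devices.Acme" project),
// which references only the interface project.
public class AcmeDevice : IDevice
{
    public double GetMeasurement()
    {
        // Manufacturer-specific communication would happen here.
        return 0.0;
    }

    public void SetRange(double min, double max)
    {
        // Manufacturer-specific range configuration would happen here.
    }
}
```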
I am pretty sure that this is the right way to go about it, because I will then be able to change out which device is used without affecting the application (which seems to happen from time to time). In other words, it won't matter how the concrete implementations of `GetMeasurement` or `SetRange` actually work (which may differ among manufacturers of the device).
The only doubt in my mind is that now both the application and the device's concrete class depend on the library that contains the `IDevice` interface. Is that a bad thing?
I also don't see how the application can avoid knowing about the device, unless the device and `IDevice` are in the same namespace.
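One way I imagine keeping that knowledge contained is to confine the concrete type to the composition root (e.g. `Main`), so that only the entry point references the concrete class's project, while the rest of the application code sees nothing but `IDevice`. A minimal sketch, reusing the hypothetical `AcmeDevice` from above:

```csharp
// Application logic: depends only on the IDevice interface.
public class MeasurementRunner
{
    private readonly IDevice _device;

    public MeasurementRunner(IDevice device) => _device = device;

    public double TakeReading()
    {
        _device.SetRange(0.0, 10.0); // assumed signature and units
        return _device.GetMeasurement();
    }
}

// Composition root: the only place that names the concrete type.
public static class Program
{
    public static void Main()
    {
        IDevice device = new AcmeDevice();          // swap vendors by changing this one line
        var runner = new MeasurementRunner(device); // runner never sees AcmeDevice
        System.Console.WriteLine(runner.TakeReading());
    }
}
```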
Question
Does this seem like the right approach for using an interface to decouple my application from the device that it uses?