In theory, if Quint were programmed with the "zeroth law" of robotics, which states "A robot may not harm humanity, or, by inaction, allow humanity to come to harm," he would still be able to kill Dr. Wily within the tenets of the laws of robotics, because the zeroth law does not carry an exception clause the way the second and third laws do. To give an example, the second law states: "A robot must obey the orders given it by human beings except where such orders would conflict with the First Law." Take special note of that exception, "except where such orders would conflict with the First Law," which, as I said, the zeroth law does not have. So, in theory, killing a single human to ensure the safety of every other human on the planet would not violate the "four laws," provided the safety of humanity as a whole takes priority over the safety of any single human.

It's basically the "trolley problem," where one must choose whether a single person dies or whether five people die. A robot shackled to the "four laws" as I've outlined them would choose to sacrifice the single person to limit the loss of life, since refusing to choose means allowing humans to come to harm through inaction, which the laws forbid just as much.

Or maybe I'm overthinking this... Damn, I need a beer.
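(If you want to get really pedantic about it, here's a toy sketch of that priority ordering. None of this is canon; the function and numbers are just my own made-up illustration of "pick whatever harms the fewest humans, and doing nothing counts as a choice":)

```python
# Toy model of the "four laws" priority argument above.
# Purely hypothetical: the zeroth law (protect humanity) outranks the
# first law (protect an individual), so the law-abiding choice is simply
# whichever action harms the fewest humans.

def choose_action(options):
    """Pick the action that harms the fewest humans.

    `options` maps an action name to the number of humans harmed if it
    is taken. "Do nothing" is itself an option, since inaction that
    allows harm also violates the laws.
    """
    return min(options, key=options.get)

# The classic trolley problem: pull the lever (1 dies) or do nothing (5 die).
trolley = {"pull_lever": 1, "do_nothing": 5}
print(choose_action(trolley))  # -> "pull_lever"
```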