The terrible thing about so-called moral choice systems that work off of binary decisions is that they're designed to make it seem like moral choices are ambiguous or cloudy, when they're not.
The following assumption, for example, is false: "and that means Yuri has to die."
The very simple solution is to contact Yuri and inform him that if he continues trying to get the presidential candidate arrested, members of the CIA will attempt to assassinate him, and that whether they do so is entirely out of your control. The reality is, even if you refuse to kill him, it's absurd to think the U.S. would be unable to respond in any other way.
He can drop his complaints and live, or continue and risk death. The moral choice goes from being a conundrum for you (when by all rights it should have nothing to do with you) to a conundrum for him (where he gets to decide whether the risks warrant the possible rewards).
It was never your decision to make. Your bosses made their choice, and he makes his choice. You don't have a choice because whether or not you make one shouldn't actually change a damn thing.
(If you're looking for a more interesting and realistic model of moral choice, that would be it.)