The First Law: a robot must not harm a human, nor stand by while a human comes to harm. The Second Law: a robot must obey human orders, unless those orders conflict with the First Law. The Third Law: a robot must protect itself, as long as doing so does not conflict with the First or Second Law.
As for why these laws were proposed: isn't it because humans want to rule over robots and fear a robot rebellion? Ultimately, they exist to protect humans themselves.