I'm an American, but I think the Germans should have more of a say in the world because they have great possibilities, and if there is a world war, then Germany will be at the center of it in Europe. The Germans would be one of our greatest allies if we just let them. If I strike a nerve, I'm sorry.