German Empire
Not exactly; it worsened relations between Germany and the United States. The United States declared war on Germany in April 1917.
~History Animation~
They stopped doing it for a while, but started again later when they got desperate or something, innit-
Oh yeah-