The true alliance between Japan and Germany came about only when Japan entered the war. Japan's attack on Pearl Harbor and other American bases led the United States to declare war on the Japanese Empire. In response, Germany declared war on the United States, further cementing its relationship with Japan.