I was thinking of culture, rather than any transient political alliance. Hasn't Germany been fundamentally European, hence "Western", at least since 1871?
Granted, these are very loose terms, but their use in discussions of "Westernizing" other countries is equally loose. The context in which "Western" was invoked a few posts back appeared to describe a cultural and economic identity rather than a political alliance. If that interpretation is incorrect, I'm sure the author of the post will clarify.