I disagree. Japan is frequently referred to as part of "the West" (despite obvious geographical issues). It doesn't fit this "derived from" idea (which itself seems a suspect concept).
I would rather see "Western" culture as defined by democracy, freedom, justice, and fairness for all under the rule of law. It seems better to have a culture with a shared destination to strive towards rather than a common ancestry.
Japan is also frequently considered not to be part of "the West", though I don't know what proportion of people think which way. I'd never consider Japan included, and in discussing either marketing or sales in business I've never heard it counted as part of the West.