What is "the west" and "western culture"

I hear ISIS and Islam and Al-Qaeda (all different organizations of course) are at war with the West or Western ideals. What does that mean?

Western hemisphere?
