Definitions: western

noun
- A film, or some other dramatic work, set in the historic American West (west of the Mississippi River), typically focusing on a Cowboys vs. Indians conflict (real or imaginary).
adjective
- Of, facing, situated in, or related to the west
- the western approaches
- (of a wind) blowing from the west; westerly
- (Capitalised) Of, situated in, or related to the West
- Western democracy
- occidental