Wild West
Proper noun (variant capitalizations: Wild west, wild West)
- The western United States during the relatively lawless 19th-century era of settlement.
- (by extension) A place or situation in which disorderly behavior prevails, especially due to a lack of regulatory oversight or an inadequate legal system.
- The CEO commented that the Russian business environment of the 1990s was the Wild West.