West Coast of the United States

The West Coast (or Pacific Coast) is the westernmost coastal region of the United States. The term most often refers to the states of California, Oregon, and Washington.