hand job
American
[hand job]
/ ˈhænd ˌdʒɒb /
noun
Slang: Usually Vulgar.
Usage
A hand job is a sexual act in which one person uses their hand to stimulate another person's genitals.
Etymology
Origin of hand job
First recorded in 1935–40
Definitions and idiom definitions from Dictionary.com Unabridged, based on the Random House Unabridged Dictionary, © Random House, Inc. 2023
Idioms from The American Heritage® Idioms Dictionary copyright © 2002, 2001, 1995 by Houghton Mifflin Harcourt Publishing Company. Published by Houghton Mifflin Harcourt Publishing Company.