
hand job

American
[ hand job ] /ˈhænd ˌdʒɒb/

noun

Slang: Usually Vulgar.
  1. an act of masturbation, especially as performed by a sexual partner on a man.


Usage

Content warning: the following content includes references to sexual activity.

What does hand job mean?

A hand job is a sexual act in which one person uses their hand to stimulate another person's genitals.

Etymology

Origin of hand job

First recorded in 1935–40