Is it time to give up on the "leftist" label?

To be clear up front, I'm not saying we necessarily should! I honestly don't know. It's just something I've been pondering.

That said, it occurs to me that the term "left" has become inextricably entangled with things like identity politics, which aren't specifically related to anti-capitalism or anti-imperialism one way or the other, and even with entities like the Democratic Party, which are, in reality, actively hostile to leftism. At this point, as much as I wish it were otherwise, the term "left" seems so co-opted that it's become virtually useless for communicating one's political views.

I mean, if I tell someone I'm a "leftist," I pretty much immediately have to explain what I mean by that. Otherwise, they're as likely as anything to assume I mean I'm a Democrat (I despise the Democratic Party and its warmongering, imperialism, and neoliberalism) who's extremely progressive socially (actually, I'd count as more of a social moderate by today's standards). And if I have to do all that explaining, what good was the term "left" doing me in the first place? I could have just skipped straight to the explanation.

I guess my point is that while "left" may accurately describe my political position based on the term's original meaning and its usage through most of its existence as a political term, I'm just not convinced it's of much use in today's world. Has anyone else had thoughts like this, or am I just crazy over here?