Two quick examples that aren't contrived:
When $\langle X_i\rangle_{i\in\{1,~\ldots~,~n\}}\sim\mathrm U[0, \theta],~\theta> 0$ (the closed right endpoint matters; with support $[0,\theta)$ the supremum of the likelihood is not attained at all).
It is an easy exercise to check that the $n$th order statistic $X_{(n)}$ is the MLE, and yet the derivative of the likelihood cannot be zero there: the likelihood $L(\theta)=\theta^{-n}$ for $\theta\ge X_{(n)}$ is strictly decreasing.
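A small numerical sketch of this (not from the reference; unit scale, a fixed seed, and a crude grid search are my own choices): the log-likelihood is $-n\log\theta$ for $\theta\ge X_{(n)}$ and $-\infty$ below it, so a grid maximizer lands essentially at $X_{(n)}$.

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 2.0
x = rng.uniform(0.0, theta_true, size=50)

def log_likelihood(theta, x):
    """U[0, theta] log-likelihood: -n*log(theta) if theta >= max(x), else -inf."""
    if theta < x.max():
        return -np.inf
    return -len(x) * np.log(theta)

# Grid search: since the likelihood strictly decreases on [X_(n), inf),
# the maximizer is the smallest feasible grid point, i.e. essentially X_(n).
grid = np.linspace(0.5, 4.0, 10_000)
ll = np.array([log_likelihood(t, x) for t in grid])
theta_hat = grid[ll.argmax()]
print(theta_hat, x.max())  # theta_hat is the first grid point at or above max(x)
```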
When $\langle X_i\rangle_{i\in\{1,~\ldots~,~n\}}\sim\mathrm{Lap}(\theta),~\theta\in\mathbb R$ (a location family with unit scale).
A unique MLE exists at the sample median when $n$ is odd, but again the log-likelihood, $-n\log 2-\sum_{i=1}^n\vert x_i-\theta\vert$, is not differentiable at any of the sample points.
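Again a quick numerical check (a sketch under my own choices of seed, sample size, and grid): maximizing the log-likelihood above is minimizing $\sum_i\vert x_i-\theta\vert$, and a grid maximizer lands at the sample median.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.laplace(loc=1.5, size=11)  # odd n, so the sample median is the unique MLE

# Log-likelihood of Lap(theta) with unit scale: -n*log(2) - sum_i |x_i - theta|.
grid = np.linspace(x.min(), x.max(), 100_001)
log_lik = -len(x) * np.log(2.0) - np.abs(x[:, None] - grid[None, :]).sum(axis=0)
theta_hat = grid[log_lik.argmax()]
print(theta_hat, np.median(x))  # the grid maximizer sits at the sample median
```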
For a situation where the likelihood has a stationary point that nevertheless fails to be an MLE, consider the location family with standard density
$$f(x\mid \theta)=\begin{cases}\frac{1}{4\exp(\vert x\vert)\sqrt{\pi\vert x\vert}}, & x\in(-\infty,0)\\ \frac{1}{2\pi\sqrt{x(1-x)}},& x\in (0,1)\\\frac{1}{4\exp(x-1)\sqrt{\pi(x-1)}},&x\in(1,\infty)\\0,&x\in\{0,1\}\end{cases};$$
though it looks intimidating, the density was constructed so that there is a local minimum in $(0,1)$ (at $x=\tfrac12$, the trough of the arcsine-type piece) and so that the graphs on $(-\infty, 0)$ and $(1, \infty)$ are mirror images of each other.
When $n=1$, the likelihood, as a function of $\theta$, is just the density shifted to the observation. So even though a stationary point exists, it is a local minimum; moreover the likelihood is unbounded near the two singularities at the edges of $(0,1)$, so no MLE exists.
![Graph of the density $f(x\mid\theta)$, showing the local minimum in $(0,1)$ and the mirrored tails](https://cdn.statically.io/img/i.sstatic.net/6ni2g2BMm.png)
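The claims about this density can also be checked numerically (a sketch of my own, with $\theta=0$): the stationary point $x=\tfrac12$ is a local minimum, the density blows up at the seams $0$ and $1$, and the two tails coincide under reflection.

```python
import math

def f(x):
    """The piecewise density above, with theta = 0."""
    if x < 0:
        return math.exp(-abs(x)) / (4 * math.sqrt(math.pi * abs(x)))
    if 0 < x < 1:
        return 1.0 / (2 * math.pi * math.sqrt(x * (1 - x)))
    if x > 1:
        return math.exp(-(x - 1)) / (4 * math.sqrt(math.pi * (x - 1)))
    return 0.0  # x in {0, 1}

print(f(0.5), f(0.4), f(0.6))  # the stationary point x = 1/2 is a local minimum
print(f(1e-8))                 # the density blows up approaching the seams 0 and 1
print(f(-0.3), f(1.3))         # the two tails are mirror images: f(-u) == f(1+u)
```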
---
Reference:
$\rm[I]$ J. P. Romano and A. F. Siegel, *Counterexamples in Probability and Statistics*, Wadsworth, 1986; Examples 8.14, 8.16, 8.17.