Problem: Let $A$ and $B$ be two invertible real $n \times n$ matrices, $n \geq 2$, such that $A + B$ is also invertible and
\[
A^{-1} + B ^{-1} = (A + B)^{-1} .
\]
Prove that $\det A = \det B$.
Solution: Multiplying the given equation on the left by $A + B$, we get
\begin{align*}
A ^{-1} + B^{-1} = (A + B) ^{-1} & \implies (A + B) \left( A^{-1} + B^{-1} \right) = I \\
& \implies A A ^{-1} + AB^{-1} + BA^{-1} + BB^{-1} = I \\
& \implies 2I + AB^{-1} + BA^{-1} = I \\
& \implies AB^{-1} + BA^{-1} + I = 0 \\
& \implies X + X^{-1} + I = 0 \tag{$\star$}
\end{align*}
where $X = AB^{-1} $, so that $X^{-1} = \left( AB^{-1} \right)^{-1} = BA^{-1}$. Multiplying $(\star)$ on the left by $(X - I) X$, we get
\begin{align*}
& (X - I) X \cdot \left( X + X^{-1} + I \right) = 0 \\
\implies & (X - I) (X^2 + I + X) = 0 \\
\implies & X^3 + X + X^2 - X^2 - I - X = 0 \\
\implies & X^3 - I = 0.
\end{align*}
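As a quick numerical sanity check (not part of the proof), one can test the implication $(\star) \implies X^3 = I$ on a concrete solution of $(\star)$: the rotation of the plane by $120^\circ$, an illustrative choice of ours, satisfies $X + X^{-1} = 2\cos(120^\circ)\, I = -I$. A minimal NumPy sketch:
\begin{verbatim}
import numpy as np

# Rotation by 120 degrees: a concrete real solution of (*),
# chosen purely for illustration.
c, s = np.cos(2 * np.pi / 3), np.sin(2 * np.pi / 3)
X = np.array([[c, -s],
              [s,  c]])
I = np.eye(2)

# X + X^{-1} + I should be the zero matrix ...
print(np.allclose(X + np.linalg.inv(X) + I, 0))      # True
# ... and X^3 should then be the identity, as derived above.
print(np.allclose(np.linalg.matrix_power(X, 3), I))  # True
\end{verbatim}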
Taking determinants in $X^3 = I$, we get
\[
(\det X)^3 = \det ( X^3 ) = \det I = 1 \implies \det X = 1.
\]
Note that we obtained $\det X = 1$ because $A$ and $B$ are real matrices, so $X$ is real and $\det X$ is a real number; the only real cube root of $1$ is $1$. Thus,
\[
\det (AB ^{-1} ) = \frac{\det A}{\det B} = 1 \implies \det A = \det B.
\]
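The same rotation matrix gives a concrete real pair on which the whole statement can be checked numerically; taking $A$ to be the $120^\circ$ rotation and $B = I$ is our own illustrative choice (then $X = AB^{-1} = A$ satisfies $(\star)$, so the hypothesis holds). A short NumPy sketch:
\begin{verbatim}
import numpy as np

# Illustrative real pair: A = rotation by 120 degrees, B = I,
# so that X = A B^{-1} = A satisfies (*).
c, s = np.cos(2 * np.pi / 3), np.sin(2 * np.pi / 3)
A = np.array([[c, -s],
              [s,  c]])
B = np.eye(2)

lhs = np.linalg.inv(A) + np.linalg.inv(B)
rhs = np.linalg.inv(A + B)
print(np.allclose(lhs, rhs))                           # True: hypothesis holds
print(np.isclose(np.linalg.det(A), np.linalg.det(B)))  # True: both equal 1
\end{verbatim}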
Note that we have used explicitly that the matrices are real: for complex matrices $A$ and $B$, the statement fails. For example, take
\[
A = \begin{bmatrix}
1 & 0 \\
0 & 1 \\
\end{bmatrix}, \quad \text{ and }\quad
B = \begin{bmatrix}
\omega & 0 \\
0 & \omega \\
\end{bmatrix},
\]
where $\omega $ is a primitive cube root of unity, that is, $\omega = \frac{-1 + \sqrt{3}\, \iota }{2}$. Then,
\begin{align*}
A^{-1} + B^{-1} & = I + \begin{bmatrix}
\omega ^{-1} & 0 \\
0 & \omega ^{-1} \\
\end{bmatrix} \\
& = I + \begin{bmatrix}
\omega ^2 & 0 \\
0 & \omega ^2 \\
\end{bmatrix} \\
& = \begin{bmatrix}
1 + \omega^2 & 0 \\
0 & 1 + \omega ^2 \\
\end{bmatrix}.
\end{align*}
On the other hand,
\begin{align*}
(A + B) ^{-1} & = \begin{bmatrix}
1 + \omega & 0 \\
0 & 1 + \omega \\
\end{bmatrix}^{-1} \\
& = \begin{bmatrix}
(1 + \omega )^{-1} & 0 \\
0 & (1 + \omega )^{-1} \\
\end{bmatrix} \\
& = \begin{bmatrix}
(1 + \omega^2 ) & 0 \\
0 & (1 + \omega^2 ) \\
\end{bmatrix},
\end{align*}
where the last equality holds because $1 + \omega + \omega ^2 = 0$. More precisely,
\[
(1 + \omega )(1 + \omega^2 ) = 1 + \omega ^2 + \omega + \omega ^3 = (1 + \omega + \omega^2) + \omega^3 = 0 + 1 = 1,
\]
so $(1 + \omega )^{-1} = 1 + \omega^2$. Hence $A^{-1} + B^{-1} = (A + B)^{-1}$, yet $\det A = 1$ while $\det B = \omega^2 \neq 1$, so $\det A \neq \det B$.
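For completeness, the counterexample itself is easy to verify numerically, with $\omega = e^{2 \pi \iota / 3}$:
\begin{verbatim}
import numpy as np

# The complex counterexample from the text: A = I, B = omega * I (2 x 2).
omega = np.exp(2j * np.pi / 3)   # primitive cube root of unity
A = np.eye(2, dtype=complex)
B = omega * np.eye(2, dtype=complex)

lhs = np.linalg.inv(A) + np.linalg.inv(B)
rhs = np.linalg.inv(A + B)
print(np.allclose(lhs, rhs))     # True: the hypothesis holds ...

# ... yet the determinants differ: det A = 1, det B = omega^2.
print(np.linalg.det(A), np.linalg.det(B))
print(np.isclose(np.linalg.det(A), np.linalg.det(B)))  # False
\end{verbatim}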