If you are going to apply the Borel-Cantelli lemma, you expect to show that the sum of the probabilities of the events $Y_i\ne X_i$ is finite. Somehow this must be derivable from the assumptions, and about the only useful assumption available is that the common distribution has a finite mean. What is the connection between these statements?
Notice that $Y_i \ne X_i$ is equivalent to $|X_i| \ge i.$ There's no problem working with $|X_i|$ instead of $X_i$ because:

1. The independence of the $X_i$ implies the independence of the $|X_i|.$
2. Because the $X_i$ are identically distributed, the $|X_i|$ are identically distributed.
3. The mean of $X_i$ is defined and finite if and only if the mean of $|X_i|$ is defined and finite.
What's the connection between the mean of a variable and the chance that it takes a large value? The answer lies in the "tail probability expectation formula" (see the reference below, or consult Expectation of a function of a random variable from CDF),
$$\mathbb{E}(|X_j|) = \int_0^\infty \Pr(|X_j| \gt x)\,\mathrm{d}x$$
for any $j.$
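As a quick numerical sanity check (not part of the proof), the two sides of this formula can be compared by simulation. The sketch below assumes, purely for illustration, that $|X|$ follows an Exponential(1) distribution, so that $\mathbb{E}|X| = 1$ and $\Pr(|X| \gt x) = e^{-x}$ exactly:

```python
import numpy as np

# Monte Carlo check of E|X| = ∫_0^∞ Pr(|X| > x) dx, assuming (for
# illustration only) that |X| ~ Exponential(1), so E|X| = 1.
rng = np.random.default_rng(0)
samples = rng.exponential(1.0, size=100_000)

# Left-hand side: the sample mean of |X|.
mean_direct = samples.mean()

# Right-hand side: integrate the empirical survival function over a grid.
grid = np.linspace(0.0, 20.0, 2001)  # tail beyond 20 is ~e^{-20}, negligible
sorted_samples = np.sort(samples)
survival = 1.0 - np.searchsorted(sorted_samples, grid, side="right") / samples.size
# Trapezoidal rule, written out explicitly.
tail_integral = np.sum((survival[:-1] + survival[1:]) / 2.0 * np.diff(grid))

print(mean_direct, tail_integral)  # both should be close to 1
```

Both quantities estimate the same number, in agreement with the formula.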
To relate this to $\Pr(|X_i|\ge i),$ break the integral into pieces at the integers $i=1,2,3,\ldots$ and slightly underestimate the integrand in each piece, using the fact that the $X_i$ are identically distributed. Specifically, for any $x\lt i,$
$$\Pr(|X_j| \gt x) \ge \Pr(|X_j| \ge i) = \Pr(|X_i| \ge i).$$
This is the key step in the following derivation, which begins by invoking observation (3) to conclude that $|X_j|$ has finite expectation:
$$\eqalign{
\infty \gt \int_0^\infty \Pr(|X_j| \gt x)\,\mathrm{d}x &= \sum_{i=1}^\infty \int_{i-1}^i \Pr(|X_j| \gt x)\,\mathrm{d}x \\
&\ge\sum_{i=1}^\infty \int_{i-1}^i \Pr(|X_i| \ge i)\,\mathrm{d}x \\
&= \sum_{i=1}^\infty \Pr(|X_i| \ge i)\,\int_{i-1}^i \mathrm{d}x \\
&= \sum_{i=1}^\infty \Pr(|X_i| \ge i).
}$$
That is precisely the summability condition needed to apply the Borel-Cantelli lemma. QED.
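The final bound can be illustrated numerically. Assuming (for illustration only) that the common distribution of $|X_i|$ is Exponential(1), the tail sum is an explicit geometric series, and it indeed comes out finite and below the mean:

```python
import math

# Numerical illustration (not part of the proof): with the assumed common
# distribution |X| ~ Exponential(1), the derivation says
#   sum_{i>=1} Pr(|X_i| >= i)  <=  E|X|.
# Here Pr(|X| >= i) = e^{-i}, so the sum is geometric, with value
# 1/(e - 1) ≈ 0.582, comfortably below E|X| = 1 -- finite, as
# Borel-Cantelli requires.
mean_abs = 1.0
tail_sum = sum(math.exp(-i) for i in range(1, 200))  # terms past i≈200 are negligible

print(tail_sum)  # close to 1/(e - 1) ≈ 0.582
```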
Reference
Ambrose Lo (2019), "Demystifying the Integrated Tail Probability Expectation Formula," *The American Statistician*, Volume 73, Number 4, pp. 367-374.