I have heard several people recently claim that all white Americans are privileged as a result of slavery.
I very much doubt that any white American alive today has benefited from slavery. Slavery was abolished 150 years ago. Before the Civil War, only a small minority of whites owned slaves, and poor whites probably suffered reduced wages from labor-market competition with slave labor. Slaveowners lost their slaves without compensation, and most of the wealth of the South was destroyed by the war and Reconstruction. There is no remaining wealth from slavery, and there has not been for a very long time.
Having the descendants of African slaves living in the USA may have been of some economic advantage to some whites, but that has to be weighed against the economic harm, such as the decline of Detroit and Chicago. I will defer to economists who may have analyzed this, but I very much doubt that there is any net benefit.
On the other hand, American blacks appear to have benefited from slavery. Most of them are better off than their cousins in Africa. Even during American slavery before 1860, the black slaves were probably better off in the USA; many of them would have been slaves in Africa anyway. The American slaves probably had a higher standard of living, and perhaps even more personal freedom, than their African counterparts.
I could be wrong about some of this, but I listen to black scholars on sources like NPR, and they do not appear to dispute any of this. They have complaints, such as occasional use of the N-word, but no substantive argument for white privilege resulting from slavery.