I would have assumed something like this:
... the physics means that you have to be very close or get so little power that it's not really useful.
For instance what if I'm trying to charge my phone remotely and something (or someone) gets in between the charger and the phone? Does that object/person heat up? Does the phone stop charging?
Exactly. Power transfer over any distance will either be very lossy or very directional, or both.
Existing WiFi signals have to be kept at a power low enough that you can't be burned by the aerials (with a good safety margin), but communication still works because the receivers can operate on tiny amounts of radio-frequency power.
To harvest enough power to keep a mobile phone running on standby, the phone would either have to be really close to the WiFi device, or the radio power would have to be too high to be safe.
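To put rough numbers on that, here's a back-of-envelope Friis free-space calculation. The figures are my own illustrative assumptions (2.4 GHz WiFi at the 100 mW EIRP limit, idealised unity-gain antennas at both ends), not measurements:

    import math

    # Friis free-space sketch with assumed figures: 2.4 GHz WiFi at the
    # 100 mW EIRP limit, idealised unity-gain antennas on both ends.
    P_TX = 0.1                     # transmit power, W (100 mW)
    WAVELENGTH = 3e8 / 2.4e9       # ~0.125 m

    def received_power(distance_m):
        """Friis transmission equation, both antenna gains set to 1."""
        return P_TX * (WAVELENGTH / (4 * math.pi * distance_m)) ** 2

    for d in (0.1, 1.0, 5.0):
        print(f"{d:4.1f} m: {received_power(d) * 1e6:8.3f} uW")

At 5 m that works out to roughly 0.4 microwatts: plenty to decode a data signal, but four or five orders of magnitude short of the tens of milliwatts a phone draws on standby.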
Depending on the frequency used, radio power can be directed. That's how radar measures the distance in a particular direction. The antenna is then rotated to read the distance in all directions, and a 2D map appears as if by magic. Modern radars steer the beam electronically instead, using lots of fixed aerials fed with carefully staggered phases (a phased array).
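The "clever electronics" boil down to feeding each fixed aerial a phase offset proportional to its position. Here's a toy sketch for a uniform linear array; the frequency, spacing, element count, and steering angle are all made-up illustrative values:

    import math

    # Per-element phase shifts to steer an 8-element linear array
    # 30 degrees off boresight. Half-wavelength spacing is a common
    # textbook choice.
    WAVELENGTH = 3e8 / 10e9          # 10 GHz radar band -> 3 cm
    SPACING = WAVELENGTH / 2         # element spacing, m
    THETA = math.radians(30)         # desired beam direction

    for n in range(8):
        phase = -2 * math.pi * n * SPACING * math.sin(THETA) / WAVELENGTH
        print(f"element {n}: {math.degrees(phase) % 360:5.1f} deg")

    # Re-computing these phases for a new angle sweeps the beam with no
    # moving parts. That is the electronic "rotation".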
People have tried using similar devices to direct power at a phone or other device that needs charging. While that could improve the distance/efficiency trade-off, I still think it would be difficult to get enough power across a room to be useful while staying safe. It would also need very expensive electronics, would be very inefficient, and would only work line-of-sight. If wireless charging of a phone delivered 5 W of charging at 1% end-to-end efficiency, putting all the UK's phones on charge at the same time would approximately double the UK's electrical power consumption.
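A quick sanity check on that last claim, using rough assumed figures (around 70 million active phones in the UK, and roughly 32 GW average UK electricity demand):

    # Back-of-envelope check; both headline figures are rough assumptions.
    PHONES = 70e6           # active phones in the UK (assumed)
    CHARGE_W = 5.0          # power delivered to each phone, W
    EFFICIENCY = 0.01       # assumed end-to-end efficiency of beamed power

    wall_draw_per_phone = CHARGE_W / EFFICIENCY      # 500 W per phone
    total_gw = PHONES * wall_draw_per_phone / 1e9
    print(f"{total_gw:.0f} GW extra, vs ~32 GW average UK demand")

That lands at about 35 GW, which is indeed roughly the UK's average demand all over again.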
I think that a lot of people who suggest these things don't see the difference between WiFi getting full signal across a room, where over 99.99% of the power is lost, and a wireless charging pad, which only loses around 50% of the power but needs the phone within millimetres of the transmitter and in the correct orientation.