They can and they will.
Some people who know a lot about current limitations seem to doubt this in their answers here, but they are wrong.
As an example, look at the recently released Apple M1 chip, which has about the same performance as top-of-the-line Intel and AMD CPUs, but consumes a fraction of the power and accordingly produces a fraction of the heat.
Chips are rapidly becoming more power efficient, which also means they can be stacked in more layers and made far denser than they are today.
Samsung SSDs already store multiple bits in one memory cell by holding one of several distinct charge (voltage) levels per cell. Eight discrete levels encode three bits; sixteen levels, four bits per cell, are appearing in QLC drives. This alone could increase the computational density of a chip by a factor of 4 without adding heat or other issues, if applied to CPUs and not just memory banks.
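The arithmetic behind multi-level cells is simple: n distinguishable levels encode log2(n) bits. Here is a minimal sketch of that relationship; the "density vs. SLC" framing is my own illustration, not a manufacturer's specification:

```python
import math

def bits_per_cell(levels: int) -> int:
    """Bits encodable by `levels` distinguishable charge levels in one cell."""
    return int(math.log2(levels))

# SLC (2 levels) through QLC (16 levels)
for levels in (2, 4, 8, 16):
    bits = bits_per_cell(levels)
    # Density multiplier relative to a single-bit (SLC) cell
    print(f"{levels:2d} levels -> {bits} bits/cell, {bits}x density vs. SLC")
```

So eight levels give three bits per cell, and sixteen give four, which is where the "factor of 4" above comes from.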
Quantum circuits will probably soon become part of standard chips. They will take over some of the processing, not all, but in those areas a speedup by a factor in the thousands is likely. Repetitive, simple calculations like graphics should benefit the most, giving us hyper-realistic 3D interfaces and worlds without lag.
Multiple other technological improvements are at various stages of development. Even if only some of them are successful, they will boost chip performance tremendously.
As it is, I expect personal computer performance to keep trailing supercomputer performance. The lag, measured in years between a top supercomputer and an equally powerful personal computer, will continue to grow, but not too quickly. So in around 15 years, we'll probably have gaming computers with as much power as today's top supercomputer.
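As a rough sanity check on that 15-year figure (the numbers below are my own assumptions, not from this answer, and they mix consumer FP32 peak with supercomputer FP64 benchmark figures, so treat the result as order-of-magnitude only): a high-end gaming GPU delivers on the order of 30 TFLOPS, while a Fugaku-class supercomputer is in the hundreds of petaflops. Closing a gap of roughly four orders of magnitude at one performance doubling every ~13 months takes about 15 years:

```python
import math

# Assumed figures, for illustration only:
gaming_pc_flops = 30e12        # ~30 TFLOPS, high-end consumer GPU
supercomputer_flops = 442e15   # ~442 PFLOPS, Fugaku-class system

gap = supercomputer_flops / gaming_pc_flops   # ~15,000x
doublings = math.log2(gap)                    # doublings needed to close it
years_per_doubling = 1.1                      # assumed pace of improvement
print(f"gap: {gap:.0f}x, {doublings:.1f} doublings, "
      f"~{doublings * years_per_doubling:.0f} years")
```

With these assumptions the answer comes out to roughly 15 years, which is why the estimate above is at least internally plausible, even if the assumed doubling pace is the load-bearing guess.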