You only need multiple Titans if you insist on 4K Ultra with every setting maxed out. Consoles don't do that: there's always a compromise to make the hardware hit some reasonable level of fidelity, and they typically only target 30fps anyway.
I have to make the same compromises myself. Some settings hurt much more than others as you increase the resolution, but you can find a balance that looks good. Anything that can't be played at 4K can be played at 1440p High or 1080p Ultra and upscaled. 4K is pixel-dense enough that some destructive resizing from 1440p isn't a big deal, and 1080p scales perfectly 4:1. Right now most consoles are rendering at medium-quality 720p and upscaling the output, so 1080p Ultra is still a big step forward compared to a console.
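To make the scaling claim concrete, here's a quick sketch of the per-axis scale factors involved (standard 16:9 resolutions; "destructive" is shorthand for non-integer scaling, which needs interpolation):

```python
# Per-axis scale factors from a source resolution up to 4K (3840x2160).
# An integer factor means each source pixel maps to an exact block of
# output pixels; a non-integer factor requires filtering/interpolation.
TARGET_4K = (3840, 2160)

def scale_factor(src, dst=TARGET_4K):
    """Return the (horizontal, vertical) scale factor from src to dst."""
    return dst[0] / src[0], dst[1] / src[1]

print(scale_factor((1920, 1080)))  # (2.0, 2.0) -> exact 4:1 pixel mapping
print(scale_factor((2560, 1440)))  # (1.5, 1.5) -> non-integer, needs filtering
print(scale_factor((1280, 720)))   # (3.0, 3.0) -> what consoles upscale today
```

1080p's clean 2x factor per axis is why it scales "perfectly 4:1": every source pixel becomes a crisp 2x2 block, with no blending between neighbors.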
Right now the PlayStation 4 has 1152 GCN 1.0 cores, which puts it between a Radeon HD 7850 (1024 GCN 1.0 cores) and a 7870 (1280 cores). Something in the 390- or 970-class would probably be a reasonable target; that's about 80-90% faster. Maybe 980- or 390X-class on the high end, which are 100-110% faster. And remember that on consoles you get the benefit of a single fixed hardware target that you can optimize for and play low-level micro-optimization tricks with (GCN is great for this).
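As a rough sanity check, the core-count arithmetic works out like this (a sketch only; core count is just a proxy for real performance, since clocks and architecture matter too):

```python
# Rough sanity check of the GPU comparison above. Core count is only a
# proxy for performance, so treat these as ballpark figures, not benchmarks.
ps4_cores = 1152
hd7850_cores, hd7870_cores = 1024, 1280

# Where the PS4's GPU sits between the two desktop parts
# (0.0 = 7850, 1.0 = 7870):
position = (ps4_cores - hd7850_cores) / (hd7870_cores - hd7850_cores)
print(position)  # 0.5 -- exactly halfway between the two

# "80-90% faster" and "100-110% faster" expressed as multipliers vs. the PS4:
midrange_target = (1.8, 1.9)  # roughly 390 / 970 class
highend_target = (2.0, 2.1)   # roughly 390X / 980 class
```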
Nobody knows where the new Polaris/Pascal chips will fall on that scale, but that performance tier will probably be considered midrange by then.
Do I think Sony will invest that much in a premium hardware configuration? I have my doubts. But it's certainly possible to make an engineered solution that will play 4K@30fps reasonably well inside a console footprint, particularly once we get the node shrink from Polaris/Pascal.
Plus, it's not even talked about in the article. What motivated your comment?