Tango's extra hardware was for depth sensing and low-energy feature tracking, but the basic technique of plane detection is, from what I understand, the same technique ARKit uses, which came from Flyby according to one article I read. In 2014, when Tango was released, even Apple's hardware wasn't powerful enough to run the tracking in software alone.
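For what it's worth, the generic software-only version of plane detection is just fitting planes to sparse feature points, commonly with RANSAC. Here's a toy sketch of that idea; it's a generic illustration, not ARKit's or Tango's actual implementation, and all the names are made up:

```python
# Toy RANSAC plane fit on a point cloud. Generic technique sketch only;
# not ARKit's or Tango's actual algorithm.
import numpy as np

def fit_plane_ransac(points, iters=200, threshold=0.02, seed=None):
    """Return (normal, d) for the plane n.x + d = 0 with the most inliers."""
    rng = np.random.default_rng(seed)
    best_n, best_d, best_count = None, None, -1
    for _ in range(iters):
        # Sample 3 points and compute the plane through them.
        a, b, c = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(b - a, c - a)
        norm = np.linalg.norm(n)
        if norm < 1e-9:  # degenerate (near-collinear) sample, skip
            continue
        n /= norm
        d = -n.dot(a)
        # Count points within `threshold` of the candidate plane.
        count = np.sum(np.abs(points @ n + d) < threshold)
        if count > best_count:
            best_n, best_d, best_count = n, d, count
    return best_n, best_d

# Synthetic cloud: a noisy "floor" near z = 0 plus random outliers.
rng = np.random.default_rng(0)
floor = np.column_stack([rng.uniform(-1, 1, (400, 2)),
                         rng.normal(0, 0.005, 400)])
outliers = rng.uniform(-1, 1, (100, 3))
n, d = fit_plane_ransac(np.vstack([floor, outliers]), seed=0)
print(abs(n[2]))  # close to 1: recovered normal is near the z axis
```

The real systems obviously do a lot more (temporal fusion, plane merging, boundary estimation), but this is the core geometric idea, and it runs fine on a phone CPU, which is why dedicated depth hardware stopped being a requirement.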
I have a feeling the end game here is that Tango-like devices are used for mapping the world, and ARKit-like libraries consume the geometry.
That is, not everyone has to own a device with depth sensing. For example, if Streetview-like services or self-driving cars equipped with LIDAR map point clouds of most outdoor areas, and merchants and vendors map indoor areas with specialized Tango-like hardware, then most of the benefits of depth-sensing AR can be had by people without depth sensors.
It would be a mostly static 3D map of the world, infrequently updated, but probably good enough to enable a large number of apps.