If you take an 8K camera with a standard 50 mm lens, its angular resolution is about 20"/pixel.
A 50 mm lens has a FOV of about 40°. It covers a cone of about 0.38 sr. A full hemisphere is 2·pi ≈ 6.28 sr, so we need at least 16.5 such cones to cover the whole sky; in practice likely 20-25, because of imperfect geometry and some safety margin at the intersections. We can, of course, mount fewer and scan.
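The coverage arithmetic above can be sketched in a few lines (the function name is mine; this assumes an ideal cone per camera, no overlap):

```python
import math

def cone_solid_angle(fov_deg):
    """Solid angle (steradians) of a cone with full apex angle fov_deg."""
    half = math.radians(fov_deg / 2)
    return 2 * math.pi * (1 - math.cos(half))

hemisphere = 2 * math.pi          # full sky above the horizon, ~6.28 sr
cone = cone_solid_angle(40)       # ~0.38 sr for a 40 deg lens
cameras = hemisphere / cone       # ~16.6 cameras, before any overlap margin
```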
If we take a plane like an A320 (larger than a typical fighter jet) and place it 25 km away, its angular size would be about 5', or 300". Our A320 would be about 15 pixels wide, assuming very good optics and very clear skies. That is not much to determine what kind of craft is approaching. At a cruise speed of 800 km/h, or about 220 m/s, the plane would reach us in roughly 113 s, or under 2 minutes. Not a lot of warning. A fighter jet making 500 m/s would be there in 50 s.
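The same numbers, worked out explicitly (a sketch; 36 m is the A320 wingspan, and the small-angle approximation is used throughout):

```python
import math

PIXEL_ARCSEC = 40 * 3600 / 7680   # ~18.8"/px: 40 deg FOV across 8K width

def angular_size_arcsec(size_m, range_m):
    """Apparent size of an object, small-angle approximation."""
    return math.degrees(size_m / range_m) * 3600

span = angular_size_arcsec(36, 25_000)   # A320 wingspan at 25 km: ~300"
pixels = span / PIXEL_ARCSEC             # ~16 px across the sensor
warning_s = 25_000 / 220                 # ~114 s at 800 km/h closing speed
```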
This is, of course, without any clouds. Even very light clouds or haze would conceal an aircraft at 25 km. To say nothing of nighttime.
We could of course use an IR camera, but I don't remember 8K IR cameras being cheap, or even available. A stealth aircraft like the B-2 does a lot to keep its thermal signature faint, including shielding the exhaust.
Blurry video is fine when using techniques like this: https://www.youtube.com/watch?v=m-b51C82-UE
Clouds also don't save you (unless you have two thick layers to fly through) because this technique is even easier with satellites. Stealth effectively no longer exists for most nation-state level tech. The B-2 is a very cool plane but is unfortunately obsolete. Still great for when you want to put on more of a show than an attack.
This is incorrect. A typical satellite will orbit once every 100 minutes or so (military spy satellites more often, because they fly lower, but that only makes the next part even worse). To have any kind of resolution, the swath it can scan is very narrow. It'll pass from horizon to horizon in some 10-14 minutes, if it passes reasonably overhead (which it'll do once; the next orbit it'll be far from overhead, or not visible at all, depending on your latitude).
For a satellite to spot an airplane you need luck. A coincidence. It's not something you can use for spotting airplanes routinely. The harder you look (the more you increase resolution), the narrower the swath gets. You can have more satellites, but there's still no chance of actively detecting airplanes on a regular basis. And this doesn't even take into account that the data must be processed after being dumped from the satellite, which is by then elsewhere.
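The pass-duration claim checks out with simple geometry (a sketch, assuming a circular orbit, a spherical Earth, and visibility all the way down to the horizon; real useful passes are shorter once you apply an elevation mask):

```python
import math

R_EARTH = 6371e3   # mean Earth radius, m

def overhead_pass_minutes(alt_m, period_min):
    """Horizon-to-horizon duration of a directly-overhead pass."""
    # Earth-central angle from the ground station to the satellite's horizon
    half_angle = math.acos(R_EARTH / (R_EARTH + alt_m))
    return period_min * (2 * half_angle) / (2 * math.pi)

# ~500 km altitude, ~94.6 min period: roughly an 11-12 minute pass
pass_min = overhead_pass_minutes(500e3, 94.6)
```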
You could use a geostationary satellite to monitor a good third of the planet at once. But then you're nearly 36,000 km above the equator and can't see any details. So, not that either.
Satellites are great for scanning the surface of the planet. And for that we're now at a stage where it's hard to hide anything, for very long at least. But moving airplanes is something entirely different.
(My job is about processing data from satellites).
Constellations (like Starshield or the Chinese equivalent) solve this problem; there are always dozens of satellites overhead, and they don't need a lot of resolving power to detect contrails. I bet even cubesats with repurposed phone-camera sensors would suffice.
Yes, but no. "Infrared" is a very wide range, and you're talking about different things here.
"Just off the red end of the visible spectrum", aka near-IR, is typically considered 700-1400 nm, which is what your normal visible-spectrum camera becomes sensitive to when you pop off the filter. That's fun, and you'll find lots of cool things there. Remote controls typically use near-IR around 850 nm, and flowers often reflect vividly in this range too. Notably, near-IR passes through most materials that are clear in the visible spectrum, which is how bugs can have eyes that see it to locate those flowers. It also passes through plastics and glass, so NIR-capable optics are cheap. (NIR windows are often dyed to look dark purple or black in the visible spectrum, because those dyes are transparent to IR.)
However, "thermal infrared" is much longer: rocket exhaust can be seen with a midwave-IR sensor in the 3000-8000 nm range, but warm bodies only start to show up in longwave-IR, 8000-15000 nm. Sensors for those are mind-bogglingly harder to make than near-IR ones. And these wavelengths don't pass through normal materials. Plain old glass, for instance, is totally opaque to thermal wavelengths, so if you take a thermal picture (a thermograph) of a window, you don't see the warm bodies inside, but rather the temperature of the glass itself, combined with whatever outside objects are reflected off its surface. It acts like a mirror, not a window.
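Why warm bodies live in LWIR and exhaust in MWIR falls straight out of Wien's displacement law (a quick check; the constant and function name are mine):

```python
WIEN_B_UM_K = 2898.0   # Wien's displacement constant, in um * K

def peak_emission_um(temp_k):
    """Wavelength (um) where a blackbody at temp_k radiates most strongly."""
    return WIEN_B_UM_K / temp_k

body = peak_emission_um(310)       # ~9.3 um: skin temperature peaks in LWIR (8-15 um)
exhaust = peak_emission_um(1000)   # ~2.9 um: hot exhaust peaks near the MWIR band
```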
This means making lenses for thermal cameras is also difficult and expensive. The materials are awful: zinc selenide is one of the most common, despite being expensive and toxic, which makes it hazardous to machine. Pure germanium works, but it's even more expensive. Sodium chloride is amusingly transparent to LWIR, but being water-soluble it tends to dislike getting rained on.
Removing the hot-mirror from a visible-light camera is a neat party trick and does legitimately see into "the infrared", but that's not the same as thermal infrared, which does still require specialized equipment.
It's not the same IR you'd need for military purposes: you're seeing a tiny percentage of the IR spectrum, and the most useless part. That's why any semi-decent military-tier system costs $50k+ a pop. Civilian-tier scopes start at $1k and are pretty shit; you won't see anything smaller than a human at more than 1 km.
I mean, yes, you can take a low-noise sensor, add cooling, add a telescope lens so that you'd see the shape more readily, put a bunch of these telescopes on a rotating platform to scan the sky, etc. This is doable, but the thread started with an idea that it's doable with consumer-grade ("cheap") tech. I doubt that.
While we're at it: even if we assume that stealth does not exist for fast and heavy aircraft, it seems to effectively exist for slow, lighter-weight drones. Ukrainian drones built from ultralight aircraft like the Aeroprakt A-20 somehow penetrate 700 miles into Russian territory to burn refineries. With a cruise speed of 70 mph (sic), it should take them 10 hours to fly this distance. Were they detected efficiently, that would be enough time to scramble an interceptor a hundred times. Apparently this does not happen.
700 miles is far more than the standard range of the A-20 (210 nm). Is it possible they launched it from well within Russia, thereby making it much less likely to be considered a threat?
Are there reliable ways to figure out which pixels on subtracted frames are cloud movement and which are the few that are an aircraft?
Correct me if I'm wrong, but adding extra cameras doesn't seem to solve this problem. Each source reporting "almost everything moves" would make solving intersections and tracking them impossible, because each target candidate can be assigned to many changed pixels in consecutive frames. Unless some additional pattern detection is done, but again, that's hard for very small objects.
Why do you have to do that, exactly? Aircraft identification and aircraft detection are very different tasks. For detection, you need a tiny fractional difference in illumination (<1%) in a single pixel that persists over time and shows up on two or three cameras separated for parallax.
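The persistence part of that can be sketched as a naive temporal filter on frame differences (a toy, single-camera version; the function name and thresholds are mine, and a real pipeline would add camera registration, the parallax cross-check between cameras, and cloud rejection):

```python
import numpy as np

def persistent_motion_mask(frames, diff_thresh=0.01, min_persist=5):
    """
    frames: (T, H, W) float array of normalized intensities in [0, 1].
    Flags pixels whose frame-to-frame change exceeds diff_thresh in at
    least min_persist consecutive frames. The idea: noise and drifting
    cloud edges decorrelate quickly, while a tracked point target keeps
    re-triggering the same small region frame after frame.
    """
    moving = np.abs(np.diff(frames, axis=0)) > diff_thresh   # (T-1, H, W)
    streak = np.zeros(frames.shape[1:], dtype=int)
    hits = np.zeros(frames.shape[1:], dtype=bool)
    for m in moving:
        streak = np.where(m, streak + 1, 0)   # run length of consecutive triggers
        hits |= streak >= min_persist
    return hits
```

A pixel brightening by 1-2% every frame for a few seconds gets flagged; a pixel that flickers once or twice does not.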
The Youtube channel Consistently Inconsistent has been doing a series on optical detection, after an offhand Elon Musk comment.
https://www.youtube.com/watch?v=m-b51C82-UE
https://www.youtube.com/watch?v=zFiubdrJqqI
https://www.youtube.com/watch?v=YZkLQsv3huo
Anything detected can be subjected to closer inspection with radars, optical telescopes, and infrared telescopes.
Anyway. Combine with microphone arrays, and your coverage is better.