
But from a photography perspective, how does any of that matter?


Altitude, atmospheric effects, and relative angular velocity are all factors in photography. Imaging from orbital platforms is also cheaper than airborne reconnaissance per square meter (although the up-front capital investment is greater), covers a wider variety of purposes, and you don't have to worry about airspace violations; however, it may provide lower resolution, especially the more affordable commercial satellite imagery, and cannot compensate for cloud cover. So the distinction is significant on technical, operational, financial, and political levels.


Cloud cover in the visible spectrum is handled by simply picking an image from a later satellite pass. There are also active-illumination imaging instruments (e.g. SAR) that penetrate clouds and work at night.

Atmospheric correction, however, is a real issue and often results in distinct patches in the "satellite" view.


Well, with the same camera, you get 100 times higher resolution from 1 km than from 100 km. But a satellite in a polar orbit typically overflies the whole Earth every few weeks (though keyhole-type satellites only photograph a very narrow track), while in many cases the only available aerial imagery is years or decades old. And a satellite can fly over restricted airspace (the only way to stop it from doing so, even for its owner, would be to blow it out of orbit), while doing that in an airplane is likely to get you thoroughly murdered and possibly result in a diplomatic incident.

The result is that satellite photographs are much more frequent and have much better coverage, while aerial photographs have much higher resolution. The dishonest naming of the Google Maps feature has given people extremely unrealistic expectations of what satellites can do, which makes it hard to sell actual satellite photography products when they don't match what people have come to expect from GMaps.
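
To put rough numbers on the 100x claim, here's a minimal sketch of ground sample distance under a simple pinhole model. The focal length and pixel pitch are illustrative assumptions, not any real camera's specs:

    # Ground sample distance (GSD) under a pinhole model:
    # gsd = altitude * pixel_pitch / focal_length
    def gsd_m(altitude_m, focal_length_m=1.0, pixel_pitch_m=5e-6):
        return altitude_m * pixel_pitch_m / focal_length_m

    print(gsd_m(1_000))    # aircraft at 1 km    -> 0.005 m/px
    print(gsd_m(100_000))  # satellite at 100 km -> 0.5 m/px, 100x coarser

Same camera, distance 100x greater, resolution 100x worse: the relationship is linear.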


Is this still true, though? Can't they just stick a big telescope lens on a satellite and get pretty detailed imagery?


You can, and satellite optics typically are a lot bigger than aerial photography optics, but the wavelength of light and the sizes of satellites you can afford to launch still impose a practical limit. For US companies, laws impose another limit.
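
As a rough sketch of where that wavelength limit lands, using the Rayleigh diffraction criterion. The mirror diameter and altitude below are assumptions (a Hubble-class mirror in low Earth orbit), not the specs of any particular satellite:

    import math

    # Rayleigh limit: theta ~ 1.22 * wavelength / aperture.
    # Ground resolution is roughly theta * altitude.
    wavelength = 550e-9   # green light, metres
    aperture   = 2.4      # mirror diameter, metres
    altitude   = 500e3    # metres

    theta = 1.22 * wavelength / aperture   # ~2.8e-7 radians
    print(theta * altitude)                # ~0.14 m: the physical floor,
                                           # before atmosphere and sensor limits

So even a very large mirror bottoms out around the ~10 cm scale from orbit; "read a license plate" is physics fiction.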


You can, and there were/are 7-8 Hubble-sized telescopes in orbit with somewhat different optics and sensors, looking in the other direction. Most likely the same is true for siblings of JWST.


There is a whole lot of pesky atmospheric interference limiting how much detail you can actually resolve.
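
One crude way to bound the effect is to treat atmospheric seeing as a single blur angle of wavelength/r0, where r0 is the Fried parameter. The r0 value below is an assumed typical figure, and this naive model overstates the blur for downward-looking systems, since most turbulence sits near the ground, far from the aperture; take it as an upper bound, not a prediction:

    # Naive seeing-limited blur, smeared over the slant range.
    wavelength = 550e-9  # metres
    r0         = 0.1     # Fried parameter, metres (assumed typical seeing)
    altitude   = 500e3   # metres

    theta_seeing = wavelength / r0   # ~5.5e-6 radians
    print(theta_seeing * altitude)   # ~2.75 m of blur in this crude model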


The further you get from the object you're photographing, the closer your photo gets to an orthographic projection instead of a perspective projection.
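
Concretely, the "lean" of tall objects away from the image centre shrinks with distance. A sketch using the standard photogrammetric relief-displacement relation; the building height and offset are made-up numbers:

    # For a nadir-looking pinhole camera at flying height H, the top of an
    # object of height h at horizontal offset x from nadir appears shifted
    # outward by x * h / (H - h). As H grows, the shift tends to zero and
    # the image approaches an orthographic projection.
    def relief_shift_m(x_m, h_m, flying_height_m):
        return x_m * h_m / (flying_height_m - h_m)

    # 50 m building, 500 m from nadir:
    print(relief_shift_m(500, 50, 1_000))    # aircraft at 1 km:    ~26 m lean
    print(relief_shift_m(500, 50, 500_000))  # satellite at 500 km: ~0.05 m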


The first incarnation of Google Maps used low-resolution Landsat imagery for most of the US. Massachusetts stood out distinctly with a different color palette because the state had a public dataset of higher-quality aerial imagery covering the whole state.



