The blue grid in the following Curvature App shows the curvature of the earth at a certain altitude as it appears at a certain Angle of view. The Angle of View, or Field of View (FoV), can also be set as a 35 mm focal length.
Click a Button to Start a Demo. Click again to skip one step. Click the Animation to start/stop. See also Controls below.
Please read the paragraph on Refraction to get familiar with this panel.
Use this Form to convert between different length units. You can copy/paste the results into input fields of the other Forms.
Get App State Get App Url Set App State Compact Clear
Use this panel to save a certain App state: click Get App State and use copy/paste to store the state in an external text file. To restore a state, copy it from the external text file into this panel and click Set App State. You can also change the parameters in this panel and apply them by pressing the RETURN key on the keyboard.
Use Get App Url to get a URL containing the current App state. Click Set App State or copy the URL into any browser address field to open this page with that App state.
Perspective on a Globe
See this article if you have trouble understanding how perspective works on a Globe.
Comparison of Globe and FlatEarth Model Predictions with Reality
See this page for a side-by-side comparison of the model predictions with the corresponding real images.
If you activate the option Show Data, 3 Lift values are displayed for an object: the absolute Lift of the nearest object, the Lift of the Horizon, and the Lift of the object relative to the Horizon. The Lift values express the change in a scene due to Refraction. The values are measured as they appear at the distance of the nearest object in the scene. So if the Horizon Lift is 10 m, the horizon appears to be lifted by 10 m as displayed on a measuring rod at the position of the nearest object.
The Lift values are computed from the viewing angles to the base of the nearest object and to the horizon line, for zero Refraction and for the Refraction entered in Refraction Coeff. k. These angles are then projected onto a plane at the distance from the observer's eye to the base of the object, in such a way that, if the object is a measuring rod, the lift values can be read from the measuring rod.
If you are looking down onto a MRod object from above, so that it looks compressed in the vertical direction, the lift values are computed accordingly, so the compressed rod can still be used to measure these values. So if a lift value is 5 m and a 10 m rod appears compressed to half its size, you can still use this compressed rod to measure the 5 m, although the lift value as it would appear on a measuring rod on a plane perpendicular to the line of sight at the position of the object would only be 2.5 m.
A Lift value always corresponds to the value as read from the (compressed) scale of a MRod object at the position of the nearest object.
The simulation parameters are grouped into the panels Views, Objects 1, Objects 2, and Refraction.
How to use URL parameters to control the App
In the Views Panel you can choose between the Globe Model, the Flat Earth Model or a side by side view of both. On the Flat Earth Model you can choose between 2 Perspectives.
The Curvature App can be used to compare a spherical earth (Globe) with the FlatEarth. You can select the desired model with the radio buttons labeled Model:
If the Flat Earth model is selected you can choose 2 different "Perspective models" for the Flat Earth, although there exists only one single law of perspective in reality:
The transformation T is such that all Z coordinates (vertical direction) of the 3D model are changed to converge linearly to eye level at the vanishing point (VP) distance in all directions. This corresponds to the side view Flat Earthers draw, where all objects and lines converge to a single VP. All lines that pass the vanishing distance are clipped. Objects behind the VP are not drawn.
Note: if the VP is placed infinitely far away, as it is in the real law of Perspective, then the Z coordinates keep their values and the result is real Perspective.
The distance of the VP in the App is calculated to equal the Globe horizon distance, which depends on the observer's altitude, plus the value of the VP Dist slider. This way the VP distance increases automatically with increasing altitude of the observer.
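The VP distance described above can be sketched in a few lines of Python, assuming the usual geometric horizon distance √(2·R·h + h²); the app's exact implementation may differ:

```python
import math

R_EARTH = 6_371_000.0  # radius of the earth in m

def vp_distance(height_m, vp_dist_slider_m=0.0, R=R_EARTH):
    """Vanishing point distance: geometric distance to the globe horizon
    for an observer at altitude height_m, plus the VP Dist slider offset.
    (Sketch; assumes the standard horizon distance formula.)"""
    horizon = math.sqrt(2 * R * height_m + height_m**2)
    return horizon + vp_dist_slider_m
```

For an airliner at 11 km altitude this gives a horizon (and thus VP) distance of roughly 375 km, which grows automatically with altitude, as the text describes.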
Note that T only scales the vertical size of objects. The horizontal size remains fixed and only shrinks on the screen with increasing distance of an object, due to the normal law of perspective. If the horizontal sizes were also scaled by T, the gaps between objects would get bigger as they move on parallel tracks into the distance, which cannot be observed in reality. So T is not an affine transformation and has nothing to do with the law of perspective used in drawings and computer graphics. It is implemented for educational purposes only.
There are situations, like boats or bridges over water, where Flerspective seems to match observations: objects near eye level in the distance disappear, similarly to the Globe model, where they get hidden bottom first by the horizon. But zooming in on the horizon shows that Flerspective does not match observations. Most observations look very different from what Flerspective produces.
Flat Earthers may object that I did not implement the right kind of Perspective. But no Flat Earther has been able to give me a precise mathematical description of Flerspective that could be implemented in code. Even in my version of Flerspective I had to fudge the numbers to match observations as well as possible. This is not necessary with the normal law of Perspective together with the Globe model.
For comparison with the FlatEarth model, a red grid can be displayed with the Grid setting Projected. The red grid shows the projection of the blue grid onto the plane of the FlatEarth. For low altitudes, the deviations between the blue globe grid and the red flat grid are minimal. So small, indeed, that with the red grid turned off the curvature can barely be noticed.
Note that the globe grid does not have a constant spacing. Instead, a certain number of grid lines are displayed, adjustable with the parameter Lines. This corresponds to natural vision, because on the earth we have no fixed grid showing relative distances either. As a result, the distance between the grid lines varies with the distance to the GlobeEarth horizon. The actual distance between the lines is displayed at GridSpacing under Computed Values.
With CameraAim you can choose which point the camera aims at (Globe Horizon or FlatEarth Equator FEEq), or which reference line (Betwn: the line between Globe Horizon and FE Equator, or EyeLvl) should stay at the center of the graphic, if Tilt = 0.
View∠ (View Angle θ) = Field of View FoV and f (Focal Length = Zoom) are linked together via the formula (see Angle of view):
(1)   θ = 2 · arctan( d / (2 · f) )

where
θ = angle of view (View∠) in degrees
f = 35 mm equivalent focal length
d = 43.27 mm = diagonal of a 35 mm frame (36 mm × 24 mm)
You can enter a View∠ between 0.1° and 160° or a focal length f between 3.81 mm and 24,800 mm. The range of the sliders is narrower.
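The conversion between View∠ and focal length can be sketched as follows. I assume the app uses the 43.27 mm diagonal of a 35 mm frame in equation (1); this assumption reproduces the stated limits of f = 3.81 mm ↔ 160° and f = 24,800 mm ↔ 0.1°:

```python
import math

DIAG_35MM = 43.27  # mm, diagonal of a 35 mm film frame (36 mm x 24 mm)

def fov_from_focal_length(f_mm, d_mm=DIAG_35MM):
    """Angle of view in degrees for a 35 mm equivalent focal length."""
    return math.degrees(2 * math.atan(d_mm / (2 * f_mm)))

def focal_length_from_fov(theta_deg, d_mm=DIAG_35MM):
    """35 mm equivalent focal length for a given angle of view in degrees."""
    return d_mm / (2 * math.tan(math.radians(theta_deg) / 2))
```

For example, fov_from_focal_length(3.81) is about 160° and focal_length_from_fov(0.1) is about 24,800 mm, matching the input ranges above.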
EyeLevel shows a line at infinity that lies at the distance Height above the FlatEarth plane at the observer. So this line is at the height of the observer's eye.
Tangent shows the tangent line to the Globe horizon. This is handy for recognizing small curvatures of the horizon.
Data displays various computed values on the graphic.
There are two identical-looking Object panels. You can combine two different sets of objects in a simulation scene. Some objects of the same type may look different in the two panels, e.g. the TTower. Use panel Objects 1 for objects in the foreground and Objects 2 for background objects. The two groups cannot be interleaved in depth, because the drawing algorithm draws all the objects of a group together. The representation of interleaved objects would be optically incorrect.
To display objects in the graphic, a value greater than 0 must be set at NObjects. The type of the objects is selected in the lower area of the panel with ObjType. If no object is visible, it is probably outside the visible range. Change the view in the Views panel, or move the objects into view with the other parameters in this panel.
Most of the parameters in this tab are self-explanatory. Note that rows with the same background color are related to each other. With the SideVar radio buttons you can select the mode for lateral movement of the objects, and the slider SideVar sets the magnitude of the displacement. The same applies to SizeVar.
All Refraction settings are made in this panel. For a detailed description, see Refraction.
Here you can convert lengths into different units. Once a value is entered into a field, all other units are calculated. You can copy a value from one of these fields by copy/paste to a field in another panel. Press the Esc key (applies to all fields of all panels) to reset the field.
The data displayed by these fields can be located in the graphic displayed below the panel. The data not shown there are:
AngDiameter (angular diameter) is the angular measurement describing how large the GlobeEarth appears from the distance Height.
GridSpacing is the grid spacing of the blue grid of the globe representation. The number of lines can be specified in the Views panel with the Lines option.
DisplHorWidth specifies the horizontal width of the black frame at the distance of the horizon. If the horizon is not curved and not rotated, this corresponds to the length of the horizon lying within the black frame. The calculation of the length of the effectively visible curved horizon line is too complicated to be packaged into a formula, but this value can be used as a good estimate.
The Curvature App can simulate how Refraction affects the Globe Model. For this purpose, the desired Refraction can be adjusted in the Panel Refraction with the red slider. If the App applies Refraction, the corresponding value is displayed at the bottom of the graph. A Refraction of zero is not displayed.
Refraction in the App can be set either A) by one of the parameters Coeff. k, Factor a, Radius R', or with the red slider; or B) computed from the atmospheric parameters Pressure Press. P, Temperature Temp. T and TemperatureGradient dT/dh. In case B), Refraction is recalculated as soon as the value in dT/dh changes.
The parameters P, T and dT/dh can also be adopted from the StdAtmosphere Barometer in the lower part of the panel by choosing a BaroLink setting other than off. The Barometer calculates the parameters of the StandardAtmosphere on the basis of the observer's altitude Height h.
Attention: the Refraction simulation only makes sense below approx. 40 km altitude. When Refraction is coupled to the StandardAtmosphere (BaroLink = StdAtm), Refraction automatically decreases with increasing altitude. You can enter any values into the fields, but you may get unrealistic Refractions and/or TemperatureGradients dT/dh.
The density of the atmosphere generally decreases exponentially with increasing altitude. Any density change causes refraction. If the density change is not abrupt but continuous, as in the atmosphere, the light is not refracted but bent; we call it Refraction anyway. Light is always bent towards the higher density, and in the atmosphere that is usually downwards. This means that objects in the distance appear higher than they would with a straight line of sight. This effect increases with the distance of the observed object, since the light beam travels a longer distance.
Refraction is not a constant phenomenon. It depends strongly on the current atmospheric conditions along the light path and therefore fluctuates on the way to the observer. Since it is impossible to measure the actual refraction along the whole path from the object to the observer, an average value is used, which can be calculated from the atmospheric conditions at the observer's location, at least for shorter distances of a few km. These values can be used for longer distances too, if conditions along the light path are similar. The average value corresponds to a light beam following an arc with the constant radius R_{R}.
Further useful reading with explanations and simulations of refraction:
Refraction can be expressed by various parameters which depend on each other:
The values 1 to 3 are directly linked to each other. As soon as one of these values is specified, the other two are calculated.
The RefractionCoefficient k is the ratio of the radius of the earth R to the radius of the light ray R_{R}:
(2)   k = R / R_{R}

where
k = RefractionCoefficient
R = 6371 km = radius of the earth
R_{R} = radius of curvature of the light ray
If the light ray is not curved, its radius R_{R} is infinite. This means that for a non-curved light ray the RefractionCoefficient is k = 0. If the light ray follows the earth's curvature, which is quite possible, then k = 1. The earth appears completely flat in this case.
A standard value of k = 0.13 is often used in surveying. Another frequently used value assumes a light ray radius of R_{R} = 7 · R, which corresponds to a coefficient of k = 0.143 or a RefractionFactor a = 7/6. The difference is small: for an altitude determination over a distance of 1000 m the difference is only about 1 mm.
The RefractionCoefficient can be calculated from the atmospheric conditions as follows (Source: Atmospheric refraction):
(3)   k = 503 · (P / T²) · (0.0343 + dT/dh)

where
k = RefractionCoefficient
P = air pressure in mbar
T = temperature in Kelvin
dT/dh = temperature gradient in °C/m
Or if we know k and want to calculate the temperature gradient we can solve (3) for dT/dh:
(4)   dT/dh = k · T² / (503 · P) − 0.0343
If we know the refraction index gradient, or refractivity gradient, we can calculate the RefractionCoefficient as follows:
(5)   k = −R · dn/dh

where
k = RefractionCoefficient
R = 6,371,000 m = radius of the earth
dn/dh = refraction index gradient in 1/m
Equation (3) can be derived from equation (5) by inserting the refraction index gradient of air as a function of pressure and temperature.
For the StandardAtmosphere this results in a maximum value of approx. k = 0.17, which decreases continuously with increasing altitude of the observer and is practically zero at altitudes above 40 km.
The TemperatureGradient, i.e. the change of Temperature with increasing altitude, can fluctuate considerably near the surface. While a decrease of Temperature of 0.65°C per 100 m is established for the StandardAtmosphere up to an altitude of 11 km, i.e. dT/dh = −0.0065°C/m, very different values can be measured a few meters above the surface. Correspondingly, Refraction is then very different too.
Over cool water or ice, the TemperatureGradient dT/dh is often positive in a layer above the surface, i.e. the Temperature in the lowest layer of the atmosphere increases with increasing altitude. Such a condition is called an Inversion. If the temperature gradient is greater than −0.01°C/m, in particular in the case of an Inversion, the air is stable (stable Inversion). If the temperature gradient is less than −0.01°C/m, which is the case with warm ground under cool air, compensating air flows emerge and the air is fluctuating, unstable.
During an Inversion, the downward bending of the light beam is most extreme and can be so strong that the light beam follows the curvature of the earth: k ≥ 1. In this case, the earth appears flat.
If the TemperatureGradient is more negative than Standard, that is dT/dh < −0.0065°C/m, which is often the case over a warm surface with a layer of cool air above, the light beam is bent less. Refraction k is then smaller than Standard. In the case of a very strong negative Gradient, when the ground is hot, the light beam can even be curved upwards, i.e. the RefractionCoefficient k is then negative. This results in a Fata Morgana or mirage, where layers above the surface appear mirrored.
Note that even if the observer is at a higher elevation, where the ground effect at the observer is negligible, the light rays from distant objects can propagate a great distance along a cool surface like the sea, and are accordingly strongly curved. Therefore, in observations over the sea or a large lake, cities, islands, or mountains can appear which, according to formulas that do not take Refraction into account, should be hidden behind the earth's curvature; see the animations Chicago and Canigou.
To get a feel for the impact of Refraction, I have assigned the following classification to the values:
| Coefficient k  | 0 to 0.12 | 0.12 to 0.18 | 0.18 to 0.38 | 0.38 to 0.58 | 0.58 to 0.78 | 0.78 to 1 |
| Classification | weak      | standard     | moderate     | strong       | severe       | extreme   |
Correspondingly, I have assigned the following classification to the TemperatureGradient:
| dT/dh          | less than −0.01°C/m    | −0.01 to 0°C/m | greater than 0°C/m      |
| Classification | unstable Layer         | stable Layer   | stable Layer; Inversion |
| Comment        | warm surface, cold air |                | cold surface, warm air  |
If a nonzero Refraction is set, the values k and dT/dh and their classifications are displayed at the bottom of the graph. If Refraction is calculated from the values of the StandardAtmosphere by setting BaroLink = StdAtm, this is indicated with the classification Standard Atmosphere.
If the surface temperature is colder than the overlying layer of the atmosphere, the air is very stable. Stable layers suppress convection and turbulent mixing of the air and thus retain their structure. In StandardAtmosphere, the TemperatureGradient is only −0.0065°C/m. It is therefore weakly stable.
Source: Atmospheric Temperature Profiles
The more positive the TemperatureGradient, i.e. the colder the surface is compared to the lowest layer of the atmosphere, the greater the Refraction. This explains why, in laser experiments over a frozen lake, no curvature of the earth can be detected: the strong Refraction bends the laser light along the earth's curvature.
The RefractionCoefficient k can be calculated from the empirically found formula (3) from the current atmospheric conditions at the observer.
In the panel Refraction, the values for Pressure P, Temperature T and TemperatureGradient dT/dh of the StandardAtmosphere are displayed. These values are defined up to a height of approx. 85 km; above that they are displayed as NaN.
If you want to use these values to calculate Refraction, you can select the setting StdAtm with the option BaroLink. Then the barometer values of the standard atmosphere are linked to Refraction calculations. If you want to set a different TemperatureGradient but want to use Pressure and Temperature of the StandardAtmosphere, you can use the option T, P. With off the link is deactivated and you can use any values for Temperature and Pressure, even those that make no sense.
For the other BaroLink options, Temperature and Pressure are linked with the Baro values, but a fixed refraction can be selected. The corresponding TemperatureGradient is then calculated therefrom. Refraction can also be adjusted with the red slider. Note that these settings are useful only in the lower part of the atmosphere up to approx. 20 km, since Refraction decreases in nature with increasing altitude and does not remain constant.
In order to be able to use the formulas for the obscuration of objects by the curvature of the earth while taking Refraction into account, there is a trick: simply replace the radius of the earth R by an increased apparent radius of the earth R', which can be calculated from the RefractionCoefficient k. I denote the conversion factor as the RefractionFactor a:
(6)   a = R' / R = 1 / (1 − k)

where
a = RefractionFactor
R' = apparent radius of the earth
R = radius of the earth
k = RefractionCoefficient
A value of a = 7/6 corresponds to Standard Refraction k = 0.14.
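A minimal Python sketch of equation (6) and the apparent radius it implies:

```python
R_EARTH = 6371.0  # radius of the earth in km

def refraction_factor(k):
    """RefractionFactor a = R'/R = 1/(1 - k), equation (6)."""
    return 1.0 / (1.0 - k)

def apparent_radius(k, R=R_EARTH):
    """Apparent earth radius R' in km for RefractionCoefficient k."""
    return refraction_factor(k) * R
```

Note that refraction_factor(1/7) gives exactly a = 7/6, matching the Standard Refraction value k ≈ 0.143 mentioned above.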
Note: in the literature the RefractionFactor is often denoted as K (big K). To avoid confusion with the Refraction Coefficient k I use a instead of K.
The RefractionFactor can be calculated from the refraction index gradient or refractivity gradient as follows:
(7)   a = 1 / (1 + R · dn/dh)

where
a = RefractionFactor
R = 6,371,000 m = radius of the earth
dn/dh = refraction index gradient in 1/m
A convenient method to analyse the effect of refraction on visibility is to consider an increased apparent radius of the earth R'. Under this model the light rays can be considered straight lines on an earth of increased radius.
(8)   R' = a · R = R / (1 − k)

where
R' = apparent radius of the earth
a = RefractionFactor
R = radius of the earth
k = RefractionCoefficient
If Refraction k is nonzero, the Curvature App uses R' instead of the radius of the earth R for the globe model to simulate the optical effect of Refraction.
Note that the apparent radius of the earth R' is not the radius of curvature of the light ray R_{R}. The relationship between the radii is:
(9)   1/R' = 1/R − 1/R_{R}

where
R' = apparent radius of the earth
R = radius of the earth
R_{R} = radius of curvature of the light ray
We can convert between temperature gradients and refractivity gradients with the following equations:
(10)   dn/dh = −(503 / R) · (P / T²) · (0.0343 + dT/dh)

(11)   dT/dh = −(R · T² / (503 · P)) · dn/dh − 0.0343

where
dn/dh = refraction index gradient in 1/m
R = 6,371,000 m = radius of the earth
P = air pressure in mbar
T = temperature in Kelvin
dT/dh = temperature gradient in °C/m
Just as the size of an object can be expressed as an angular size α, the amount an object appears to be raised due to Refraction can be expressed as a RefractionAngle ρ. The magnitude of the raising depends on the Refraction k and on the distance of the object from the observer. The further away an object is, the more it appears raised, because the light beam is longer and thus is curved over a longer distance.
The angular size α of an object in degrees is given by its size s and its distance d from the observer. A good approximation for larger distances, when d is practically equal to the view distance from the observer to the object, is:
(12)   α = (s / d) · 180°/π

where
α = angular size of the object in degrees
s = size of the object
d = distance from the observer to the object
Note: if the distance d of the object is much larger than its size s, the approximation is accurate enough.
The accurate calculation of the RefractionAngle ρ is complex if we take the slanting of the object into account; it is done by means of vector geometry. Essentially, the position of the highest point of the nearest object is calculated on a sphere with radius R and on a sphere with radius R'. Then a vector from the observer to each of these two points is calculated. The RefractionAngle ρ is the angle between these two vectors.
A simpler and good approximation for the Refraction Angle ρ is:
(13)   ρ = (k · d / (2 · R)) · 180°/π

where
ρ = RefractionAngle in degrees
k = RefractionCoefficient
d = distance from the observer to the object
R = 6371 km = radius of the earth
From the RefractionAngle we can calculate the amount l by which an object at distance d appears to be raised due to Refraction:
(14)   l = d · tan(ρ)

where
l = apparent raising of the object due to Refraction
d = distance from the observer to the object
ρ = RefractionAngle
Note: the apparent lift l of an object due to refraction increases with the square of the distance. If the object lies far behind the horizon, its relative raising l_{rel} with respect to the horizon also increases accordingly, although not as strongly, because the nearer horizon also rises with respect to EyeLevel, but to a lesser extent, corresponding to its shorter distance from the observer.
For example, if a mountain is 2000 m high and appears at an angular size of 0.5°, and the RefractionAngle is 0.25°, the mountain appears raised by 1000 m. Note that this calculation can be performed without knowing the distance to the object; the distance is contained in the RefractionAngle. If the mountain is only 1000 m high, its angular size is only half as large: 0.25°. We again obtain the same raising of 1000 m as for the higher mountain, which shows that the raising depends only on the distance, not on the size of the object.
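The mountain example can be checked numerically with the small-angle approximations (12) to (14); a sketch:

```python
import math

R_EARTH = 6_371_000.0  # radius of the earth in m

def angular_size_deg(s, d):
    """Approximate angular size, equation (12): object size s at distance d."""
    return math.degrees(s / d)

def refraction_angle_deg(k, d, R=R_EARTH):
    """Approximate RefractionAngle, equation (13)."""
    return math.degrees(k * d / (2 * R))

def apparent_lift(d, rho_deg):
    """Apparent raising l, equation (14), of an object at distance d."""
    return d * math.tan(math.radians(rho_deg))

# Worked example from the text: a 2000 m mountain at angular size 0.5 deg
d = 2000 / math.radians(0.5)       # distance implied by equation (12)
lift = apparent_lift(d, 0.25)      # RefractionAngle 0.25 deg -> about 1000 m
```

With ρ = 0.25° this yields a lift of about 1000 m, as stated above, and the same lift follows for a 1000 m mountain at 0.25° angular size, confirming that the raising depends only on the distance.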
Since the horizon also rises due to Refraction, the raising of an object that lies behind the horizon appears correspondingly smaller with respect to the horizon. If the object is in front of the horizon, it even sinks with respect to the horizon, although it is raised in absolute terms. This is because the more distant horizon appears to be raised more than the object. The greater the distance of the object from the horizon, the greater the relative rise or drop with respect to the horizon. Very distant mountains can therefore be raised beyond the horizon by a considerable proportion of their size. Thus, Refraction can make mountains that are hidden behind the curvature of the earth visible again to a large extent.
The RefractionAngle, the angular size, and the relative and absolute raising of the object are displayed in the graphic of the App if the option Show: Data on the Views panel is activated. They can also be read in the panel Refraction.
The earth is huge in comparison to us humans: a diameter of 12,742,000 m compared to a human size of 2 m. So huge, indeed, that we are not able to see its spherical shape with the naked eye from the surface of the earth. We can measure the distance to the horizon and its dip due to the spherical shape only with precise technical instruments. Only from high altitudes or from space can we clearly see the ball shape of the earth.
Even at altitudes of several kilometers, such as the cruise altitudes of airliners, the spherical shape cannot always be clearly identified. A slight curvature can only be detected on wide-angle images. It must be taken into account, however, that wide-angle lenses can distort the scene. With cheap cameras or smartphones the curvature can therefore only be observed to a limited extent.
The visibility of the curvature therefore depends on the altitude and on the angle of view or focal length, i.e. on zooming!
The fact that the horizon is lower than EyeLevel cannot be recognized with the naked eye, since in nature there is no eye-level line above the horizon. However, this drop can be seen with appropriate instruments such as the Overhead Display of an aircraft.
When zooming in on the horizon along a long straight object like the Causeway Bridge, it looks like the earth is a cylinder. This is due to the Perspective on a Globe.
In order to prove that the calculated blue grid actually reflects reality correctly, the grid can be matched with a real photo.
To superimpose a grid onto a real photo the right way, the following information is required:
Preferably use shots from high altitudes, for example from an airplane or from space. At lower altitudes, no curvature is clearly visible.
Procedure:
Set the altitude with the blue slider or enter the value in the Height input field. Select the focal length or the corresponding angle of view with one of the black sliders. Select the aspect ratio of the image at AspectRatio. With the green sliders Tilt and Roll the viewpoint and the banking angle can be adapted to the photo.
Cut out the area inside the black frame with a program like the Snipping Tool from Windows. Open the photo in any image editing program. Place the cut-out area of the grid in a new layer above the photo. Scale the grid layer so that the aspect ratio is maintained and the grid layer becomes the same size as the photo. Set the blending mode of the layer to multiply (or something similar). It may be necessary to move the grid layer slightly and rotate it if the settings of Tilt and Roll do not match exactly.
If everything was done correctly, the grid will now match exactly with the image of the earth's surface. The following photographs show what the results can look like:
The International Space Station ISS orbits the earth at an altitude of 400 km. From this altitude, the earth clearly shows itself as a sphere. I wanted to check whether the calculated grids match photos taken from the ISS. For this I searched for original photos in which data about the camera and lens used is stored in EXIF format. The reason is that, in order to get the correct perspective representation, I have to enter the focal length of the camera used into the Curvature App.
I have found several such images on NASA's website. Below are two such examples with and without superimposed grid:
For the above picture, I used an original photo from NASA. According to the EXIF data, the image was edited with Photoshop, probably only converted to a JPG. I can't find traces of compositing or any other manipulation, and the noise is as expected from a camera with the selected settings.
I set Height = 400 km and a 35 mm focal length of f = 28 mm in the Curvature App. With Tilt and Roll I rotated and shifted the graphic to match the photo, because the photographer did not aim at the horizon. Then I made a screen copy of the graphic and opened it together with the photo in Photoshop. I placed the graphic on a new layer on top of the photo and inverted its colors. The graphic and the photo have the same aspect ratio of 3:2; I only had to scale the graphic to the same size as the photo. After that, I superimposed the graphic over the photo with the blending mode "negative multiply".
And look: the graphic fits the photo exactly. The lines have a spacing of GridSpacing = 48.91 km. The Gulf of Suez fits exactly between two lines. Measuring in Google Earth I get about 50 km, so this also fits perfectly.
Below is another picture of the earth taken from the ISS, photographed with the same camera. The superimposed grid of the simulation also fits perfectly here. The faint gray line corresponds to eye level, i.e. the horizon of a FlatEarth.
Here are some screenshots taken from the video GoPro Awards: On a Rocket Launch to Space, which was recorded with a GoPro4 camera with a fisheye lens. I applied the lens correction of Adobe Lightroom, and after that the images fit perfectly to the calculated grid:
Height = 120 km, Focal Length: f = 18 mm, Camera GoPro4
After applying the lens correction, the horizon has exactly the same curvature in all images at every position.
Flat Earthers claim that the horizon is always at eye level, which it would be if the earth were flat. The definition of eye level is that a line from the eye of the observer to a distant point at the same height forms exactly a 90° angle with the vertical at the observer. The distant horizon of a FlatEarth would apparently reach up to eye level and thus form a 90° angle.
A dip angle from eye level to the real horizon cannot be estimated with the naked eye, since a corresponding reference is missing at the horizon. Just looking straight at the horizon and claiming that it is at eye level, i.e. that it forms exactly a 90° angle with the vertical, is a false claim; this holds approximately for low altitudes only. In an aircraft at an altitude of 11 km, the horizon drops 3.36° (see DipAngle in the Curvature App). This is a clear drop, but it is not recognizable with the naked eye because there is no reference.
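The stated dip of 3.36° at 11 km can be verified with the geometric dip formula, ignoring refraction; a minimal sketch:

```python
import math

R_EARTH = 6_371_000.0  # radius of the earth in m

def dip_angle_deg(h, R=R_EARTH):
    """Geometric dip of the horizon below eye level, in degrees,
    for an observer at altitude h (refraction ignored)."""
    return math.degrees(math.acos(R / (R + h)))
```

For h = 11,000 m this gives about 3.36°, and for h = 10,275 m (33,709 ft, the Theodolite photo below) about 3.252°, matching the values quoted in this article. Taking refraction into account, R would be replaced by the apparent radius R'.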
The following photo was taken with the Theodolite App on an iPhone. The aircraft flew at an altitude of 33,709 ft, as noted at the top center of the picture. The iPhone was aligned so that the crosshair shows eye level at the horizon. This is the case when ELEVATION ANGLE shows 0.
The calculation results in a dip angle of 3.252°. The horizon is 20.53 km below EyeLevel and at a distance of 361.6 km. The overlaid grid lines have a spacing of 8.035 km. These values are all calculated by the Curvature App.
I did not take the picture myself but found it on BlogSpot. There is a copy of it on my website. I own the Theodolite app on my iPhone and I know how it works. I determined the focal length of the iPhone by measuring the angle of view, which I could do with the app. The measured angle of view of about 65° for the diagonal coincides with data on the Internet. It corresponds to a 35 mm focal length of 33.9 mm.
I entered the values Height = 10,275 m, angle of view 65°, and display aspect ratio 16:9 into the simulation. Then I cut out the simulation image along the black frame, scaled it to the same size as the photo, and overlaid them with the blending mode multiply. As you can see, the calculated image fits the photo exactly and shows exactly where the horizon of the earth is with respect to EyeLevel. Note that a very slight curvature is barely visible in the grid, but because of the haze at the horizon it is not as visible in the photo.
How to observe the horizon drop with a simple homemade tool is shown in the following video: Horizon Drop at Varying Altitudes. FlatEarth Debunked. by madmelon101.
Airplanes can be equipped with overhead displays. These displays are pushed between the pilot and the front window. When the pilot looks out of the window through this glass screen, he can see all critical flight information like artificial horizon, speed, altitude, vertical speed, heading, even the runway, and also the terrain, as on a night vision device. It is remarkable that the displayed graphics move in sync with the head movements of the pilot. It looks like the graphics are projected onto the terrain.
If the aircraft is cruising at high altitudes, in the image at 39,000 ft, the real horizon lies about 3.5° below eye level due to the earth's curvature. The display projects a horizontal line at eye level into the scene. In the picture you can clearly see the gap between the eye-level line and the real horizon.
The stylized airplane in the display shows the effective flight direction. In the picture, the symbol lies on the horizontal eye-level line, which means that the aircraft neither climbs nor descends. It is located to the left of the center, which means that the aircraft does not fly straight ahead but is pushed sideways to the left by the wind (see arrow at the top left). The aircraft must correct for this drift by pointing the nose into the wind according to the arrow, so that it does not miss its destination. The autopilot performs this correction automatically.
See More evidence the Horizon does not remain at eye level as you gain altitude for an explanation from a pilot.
The line of sight to the horizon is rarely the straight line assumed by the simple formulas; it is curved downwards due to the temperature and pressure changes of the atmosphere near the ground (refraction). This means that you can see much further than calculations with a straight line suggest.
In extreme cases, e.g. when warm air lies above cold water, refraction can guide the light hundreds of kilometers along the water surface! The result is that the earth seems flat.
Source Wikipedia: https://en.wikipedia.org/
This fact has been known for centuries among land surveyors and seafarers.
Note: you can only trust your eyes at short distances. Over large distances, the light path through the atmosphere is disturbed in unpredictable ways. Things are not what they seem!
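How much further refraction lets you see can be estimated with the usual surveying trick of replacing the earth's radius R by an effective radius R/(1−k), which straightens the curved light ray. A small sketch (function name and the 2 m example are mine):

```python
import math

R = 6_371_000.0  # earth radius in m

def horizon_distance(h, k=0.0):
    """Distance to the horizon (m) for eye height h (m).

    Average refraction with coefficient k is modeled by replacing
    R with an effective radius R/(1-k).
    """
    R_eff = R / (1.0 - k)
    return math.sqrt(2.0 * R_eff * h)

# 2 m eye height: standard refraction (k = 0.13) pushes the horizon out ~7%
print(round(horizon_distance(2.0) / 1000, 2))        # 5.05 (km)
print(round(horizon_distance(2.0, 0.13) / 1000, 2))  # 5.41 (km)
```

As k approaches 1, the effective radius goes to infinity and the horizon distance grows without bound, which is the "light guided along the surface" case described above.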
In the excellent video FLAT EARTH EXPERIMENT TELESCOPE from 01.08.2016 the author Alex Chertnik shows the right way to make and document telescope measurements over water. On different days and at different times of day, he measures over three similar distances how much of four roughly 300 m high chimneys is hidden by the curvature of the earth.
In contrast to all flat-earth videos, he takes refraction into account in his calculations. His measurements correspond exactly to the calculations for a globe earth with a radius of 6371 km, taking standard refraction into account.
The video clearly shows how the image wobbles and flickers due to fluctuations in refraction, and that the horizon is not a clean horizontal line but shows wavy distortions. These waves come only to a small extent from the water itself; mostly they arise through refraction. The occlusion fluctuates by many meters due to these refraction waves.
Note: the refraction directly above water can be much higher than the standard 7%!
The video proves very clearly that the earth must be a ball.
Great site, very informative, very well done! Thanks for this great work
Gerard...
YouTube channel Kelly White
Excellent tool and information. I'm just having a comment conversation with someone who doesn't quite understand this, but your tool will really help. Thanks!
Very interesting, Walter. You have an amazing mind.
This is awesome. :)
Superb!
But can I read off somewhere how *wide* the horizon is, from left to right?
Excellent work! Could it be possible to adjust the refraction parameter as well?
It's a nice job and it's very impressive but I don't like how the yaw also gets lower and lower as the altitude rises. :'(
Going up in a vertical elevator/balloon in real life wouldn't look like that; rethink that part, because the rest is top notch! :D
Phil, choose option HorizView = EyeLvl to keep eyelevel at the same position.
Herr Gnorts: see the new field DisplHorWith in the Computed Values panel.
Risto: Refraction is now implemented, see Refraction Panel and some of the new Animations.
Refraction can sometimes have the reverse effect of making objects in the distance seem lower than they actually are. The phenomenon is called "sinking", and it can sometimes make distant objects disappear behind the horizon when geometrically they shouldn't. Flat Earthers have cited this as the explanation for ships disappearing beyond the horizon and for the towers in Soundly's videos curving downward. How should rational people respond to this claim?
@Everett Anderson
To produce Sinking instead of Raising, compared to Standard Refraction, the atmosphere must have a steeper lapse rate than normal. The lapse rate is the negative temperature gradient −dT/dh. However, there isn't much room to play with: the Standard Atmosphere already has a lapse rate of 6.5°C/km, and convection limits lapse rates in the free atmosphere to about 10°C/km. For refraction to be zero, the lapse rate would have to be 34.3°C/km. To bend light upward it has to be even greater, which can only occur in thin layers. To get a temperature decrease of more than 34.3°C/km you need a hot surface with a layer of cold air above it. Such conditions produce heavy distortions and mirages of different kinds rather than only Sinking, because the air is unstable.
Because the density of undisturbed air increases with decreasing altitude, light is generally bent only downwards. Only specific changes of the temperature gradient near the surface can locally change the density gradient in such a way that distorted and mirrored layers and some Sinking may appear. On most images we already see stretching and mirroring in the lowest layer even when the overall image is still lowered by the average refraction. These distortions are caused by thin layers of cold air above warm surfaces.
More information: Looming, Towering, Stooping, and Sinking
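The relationship between lapse rate and refraction coefficient described above can be made concrete with a widely used empirical surveying formula, k = 503·(P/T²)·(0.0343 + dT/dh); treat the exact constants as an assumption of this sketch:

```python
def refraction_coefficient(P_mbar, T_kelvin, dT_dh):
    """Refraction coefficient k from an empirical surveying formula.

    P_mbar:   air pressure in mbar (hPa)
    T_kelvin: air temperature in K
    dT_dh:    temperature gradient in °C per meter (negative = lapse)
    """
    return 503.0 * (P_mbar / T_kelvin**2) * (0.0343 + dT_dh)

# Standard Atmosphere: 1013 mbar, 15 °C, lapse rate 6.5 °C/km
k_std = refraction_coefficient(1013.0, 288.15, -0.0065)
print(round(k_std, 2))  # 0.17

# A lapse rate of 34.3 °C/km gives k = 0 (no bending); anything
# steeper bends light upward and can produce Sinking.
print(round(refraction_coefficient(1013.0, 288.15, -0.0343), 3))  # 0.0
```

The formula reproduces both numbers used in the text: k ≈ 0.17 for the Standard Atmosphere and k = 0 at a 34.3°C/km lapse rate.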
This sim has a fundamental flaw.
What you can see is limited to the aperture through which the light passes.
Where is that calculation?
I should also say the refraction values are a wild guess at best, for a simple reason: there is an assumption of linearity. This is a misplaced and provably wrong assumption.
@indio007
Quote: What you can see is limited to the aperture through which the light passes. Where is that calculation?
First, aperture does not limit or influence which part of a scene is depicted on the sensor. The loss of light when closing the aperture is compensated by longer exposure and higher ISO values; you see the exact same thing. With aperture you can influence the depth of field.
So aperture does not change the shapes or even the relative positions of objects in a scene. Look at the Animations and then look at the real images linked above the App when an animation is chosen. The simulations of the Globe Earth match the images, but the Flat Earth simulation does not match at all. And the simulated situations are taken from videos Flat Earthers themselves provided, by the way.
Second: the App does not simulate a real camera, only the projection of 3D objects onto the focal plane of a camera, without aperture, exposure time and ISO settings. In such a projection there is no such thing as aperture, just as in a drawing there is no such thing as aperture. But the computed 2D image of the 3D objects is accurate.
Quote: I should also say refraction values is a wild guess at best. For a simple reason. There is an assumption of linearity. This is a misplaced and provably wrong assumption.
It is not assumed that the density gradient of the atmosphere is linear in reality. But on sufficiently small scales, systems can always be approximated as linear (just as the globe earth can be approximated as a flat plane on small scales). Linear approximations are the normal way most mathematical models of physical systems are derived. It's an application of calculus and results in differential equations that can then be applied to any (nonlinear) situation, within the limits the model is intended for.
So the math of how atmospheric layers bend light rays can be derived from a linearized model, and this math can then also be applied to real nonlinear systems. Why is that so? Because a mathematical model is universal: it maps an input (density gradient) to an output (bending of light). You can derive the mapping function from linearized systems, but you are not restricted to using it only on linear density gradients. It works on any gradient, because the math model is a representation of the real physical system and can be used to predict the outcome for any input.
That is true for all derived physical laws. E.g. Newton's law of gravity is not only applicable to simple linear systems but is universal, as long as gravity is not too strong and the speeds involved are much slower than the speed of light. Physicists know these limitations and know that Relativity must be applied under those conditions.
Please read the second paragraph of What is Refraction?. There I explain why and how average values, derived from many, many real measurements by surveyors, may be applied to get the average overall effect of refraction. You can always approximate a real physical system by a simplified average version with perturbations superimposed. The perturbations are often smaller than the simplified part and only slightly perturb the outcome. We see that in real images where refraction takes place: the overall image of an object may be raised by the average part of refraction, while some parts of it are stretched, compressed or mirrored on top of that.
My App simulates only an undisturbed average refraction. To do more, you would have to provide the real atmospheric conditions (which change all the time) along each light ray from the observer to the object, and you can never get this data anyway. But the undisturbed approximation suffices to convey the concept: a mountain is raised by the calculated amount of refraction whether its image is disturbed or not.
If you want to see simulations of refraction with complicated density gradients, see Introduction to SuperiorMirage Simulations.
May I suggest a feature? It would be nice to have a permalink to the simulation, with all parameters embedded in the URL, maybe as a long JSON string in a URL parameter. Or if that's not possible, the ability to copy or save all parameters as a JSON string.
@Priyadi
These features are now implemented. See the Save/Restore panel below the App window.
Check this out here:
Awesome work, Walter.
I was fumbling around with geogebra just to make simple views and had to give up a few times.
Your interactive model is brilliant.
By the way, an idea: what about another app that shows the relative sizes of the earth and moon using distance and lens focal length, libration, eclipses, INCLUDING the specular aspect of its surface, phases and the illumination of the moon from an angle almost right behind us...?
Just came across your site. Upon first look it appears to be well done. I will look at it more critically in the days to come. I have also engaged in refraction theory and made many measurements / observations with a theodolite total station (Topcon GTS3C) in varying atmospheric conditions and will check over your work.
Seems like you put together a worthwhile blog.
George Hnatiuk
BTW:
It would be VERY useful to be able to pass created demonstration diagrams in a url...
So the diagram parameters are ready and presented when someone visits...
This is a beautiful app. Nice job. (Is there a pause button I didn't see?)
Regarding horizon dip angle, it's not necessary to have an eye-level reference if opposite horizons can be seen at the same time. If a straightedge can be pointed at one horizon, it will point above the opposite horizon with twice the dip angle. This could be done with two small mirrors mounted on adjoining faces of a cube, with the whole thing mounted on a ruler; then you just take a picture looking at the mirrors. I haven't built it. Of course there are plenty of other ways to determine "level", like a line between the tops of opposite windows in an aircraft cabin. (I'm sure the theodolite app is the simplest, but I like low-tech :)
Grahame, It would be VERY useful to be able to pass created demonstration diagrams in a url...
You can do this in the Save/Restore panel. Use the button Get App Url, then copy the generated URL into, e.g., YouTube comments. Clicking this URL will open the App and restore the corresponding state.
Note: you can also copy such a URL into the text field in this panel and click Set App State. The button Get App State gives the current state in an editable JSON format. You can change values and add text in DemoText and Description, then click Set App State to show this state. Use Get App Url again to get the URL with the added text.
To start an animation via a URL to this page, simply add &demo=xxx, where xxx is the name of the demo button.
Walter,
I really do want to thank you for this website. I have always been somewhat of a conspiracy theorist, so naturally YouTube recommended me Flat Earth videos. About 6 months ago I watched quite a few, and while I did not think that such a claim could be possible, some of the videos were somewhat convincing, due to flat earthers taking things out of context and purposefully misrepresenting anomalies (such as refraction) which can easily explain away their claims. I was pointed to your site by MetaBunk, and I am so glad I found it. It is constructed excellently, yet is simple. You clearly are very smart to be able to set up all of the simulations, yet you make them easy enough to use so that everyone can try them and actually learn things they likely never would have. I really do appreciate your site for saving me from the unintelligent Flat Earth rabbit hole, and I recommend it to Flat Earthers, as well as people in general.
Thanks again,
Nick
Very impressive, but that's a lot of science that should never have been required. I don't understand the maths, but then again I don't need to. I know the earth is a globe; logic tells me it can't be anything other than a globe. For all the pooh-poohing by the FE brigade of any sensible explanation as to why the earth isn't flat, they won't explain why the internet isn't awash with photos taken through telescopes with captions such as "here's my picture of the Cuban coastline taken from the coast of Ireland" or "here's my photo of Australia taken from South Africa". Need I go on?
Well, this calculator is false. Refraction doesn't work on the flat earth model here in this calculator, the models don't have a sky, and the calculator is therefore biased in favor of a globe.
Globe: Flat Earthers have no working math model of refraction that I could incorporate into the App; sorry, that is not my fault. If I apply the working standard physical model of refraction, the Flat Earth would look like a Hollow Earth. To make the earth appear as we see it, light on the Flat Earth model would have to be bent upward by refraction. This contradicts physical laws: light always gets bent toward the denser medium. The density of air decreases with increasing altitude, so light gets bent downward. The consequence is that under strong refraction the globe can look flat, and mountains that are hidden behind the curvature can appear raised above the horizon.
If you want to simulate how the Flat Earth would look using a false refraction model where light gets bent upwards, you can use the Globe model with values from k = 1 (no refraction on FE, earth looks flat) to k = 0 (severe refraction on FE, earth looks as curved as a globe with radius 6371 km).
I look forward to your calculator for Flat Earth Refraction.
indio007 12/29/2017 07:38
Your refraction section is complete nonsense. Yes, the math is sound, but it has no connection to physical reality. What empirical tests were used to validate the model? NONE. While I am on it, you people think that the refractive index prescribes some universal bending of light. NO! That's not how it works. It is the CHANGE in the refractive index that bends the light. The real physics is this...
1. The refractive index is the ratio of the speed of light in the medium of interest to vacuum speed. The change in the speed of light from one medium to another is what causes the change in direction. Well, air is very close to vacuum. The difference is in the fourth decimal. The difference between different air refractive indices is in the 5th and 6th decimal. The change is very small therefore the bending is very small IN THE REAL WORLD.
2. Plus, we generally are using a constant altitude for line of sight. There is little or no change in the refractive index because the density and pressure do not change at a constant altitude. You have to concoct a change in altitude by assuming a globe and pretending a beam of light traversing two mountain peaks goes high–low–high in altitude. You assume the globe and prove the globe, which is no proof at all. If you assume flat, there is no change in refractive index and therefore no bending of light.
3. You say standard refraction. OK, how was that derived? Ah yes, they made a model by assuming a structure of the atmosphere and then making a model that should work based on the assumptions, NOT EXPERIMENT. There is no empirical basis, but you guys just run with it. Little do you know that the refraction approximation can't be used at large zenith distances because the refractive index BLOWS UP at 89° from zenith. At 90° from zenith (looking directly at the horizon) light should curve so much you should see the back of your head. That's not physical, so the model is NONPHYSICAL. Source: page 108 and fig. 15. Title: Understanding astronomical refraction
Authors: Young, A. T.
Journal: The Observatory, Vol. 126, p. 82–115 (2006)
Bibliographic Code: 2006Obs...126...82Y
(url)
4. Now that it's shown that the model actually models nothing, and your model is a derivative of that model, are you going to correct your error? You have no math and you have no empirical data. That means "standard refraction" applied from the horizon to 1° is inept. Unless of course you think there is infinite refraction...
5. There is no empirical data on refraction because it can't be measured consistently. The measurements don't match any model; measurements are literally all over the place. Read this: Empirical Modelling of Refraction Error in Trigonometric Heighting Using Meteorological Parameters (url)
6. Stop pretending that refractive index equations are certain and accurate. They are not. Two seconds on Google will show you hundreds of papers where people are still trying to come up with a model.
7. The fact is the refractive index is based on the dielectric constant. Traversing the boundary of two mediums with different dielectric constants is a complex equation in which the imaginary part is nonlinear but absolutely affects how a beam of light traverses the boundary. The refraction model is overly simplistic and doesn't model reality; it models a need to come up with something... anything to validate the curve of the globe.
8. Refraction in the atmosphere was invented before the time of Pliny the Elder, who wrote about the seeming impossibility of the Moon and Sun both being over the horizon during a lunar eclipse. Refraction of air was invented specifically to make the globe work!
9. Some things never change.
@indio007, truth
Complete nonsense? Can you provide me a better model? Have geodetic surveyors been measuring the earth with false models for centuries? Do you deny that airplanes and ships find their destinations over thousands of miles exactly, using spherical maps made by geodetic surveying? Can you present any map that is demonstrably false?
I use the standard model for refraction as used in geodetic surveying. The values used in geodesy are empirically derived from many, many measurements in reality. The first commonly used value k = 0.13 was introduced by Gauss.
Sources I use are:
I'm aware that refraction is strongly dependent on the current atmospheric conditions along the line of sight to an object. As nobody can provide the exact varying refraction values along any line of sight, we cannot simulate refraction exactly. But we can make many measurements under different atmospheric conditions and derive from them empirical formulas for terrestrial refraction under certain atmospheric conditions. This is applied in geodetic surveying. The empirical formulas vary slightly between countries; I used the formula presented in Wikipedia.
My App does not claim to simulate refraction with all distortions in the various layers of air. It only gives a rough picture of how a certain refraction affects a scene. With my App you can estimate how strong refraction has to be to bring a hidden mountain into view as seen on some pictures. You can simulate the overall effect of refraction on certain scenes. Not more, not less.
Did you read my section What is Refraction? Cite: The density of the atmosphere generally decreases exponentially with increasing altitude. Any density change causes refraction. If the density change is not abrupt but continuous, as in the atmosphere, the light is not refracted but bent, but we call it refraction anyway.
Yes indeed. Standard refraction bends light in the atmosphere on an arc with a radius of about 7 times the radius of the earth. A very, very small bending indeed. But the effect of this bending increases with increasing distance to the object, while the angular size of objects decreases with increasing distance. So even a very small bending can raise a very distant mountain above the horizon. See Demo Canigou.
On the globe, a straight line of sight always passes through layers at different altitudes due to the curvature of the earth. That causes different densities along the path, different refractivity and so bending of light.
I do not assume a globe. I present two models to make predictions: Globe Earth and Flat Earth. Refraction is only an additional feature that can explain certain scenes, like distant mountains and cities raised into view. You can enter known values for objects and observer and the App displays predictions for both models. Then you can compare the predictions with reality. Reality confirms one or the other model. My App does not assume a globe. My App proves nothing. My App makes predictions.
That is a false claim. Where did you get this false information? The values for standard refraction are derived using controlled setups of simultaneous reciprocal vertical angle measurements. Empirical formulas for different atmospheric conditions are then derived from these real measurements.
I don't know where you guys get the false idea that scientists make their theories out of nothing. Every theory is developed to explain observed facts, not the other way around. First you observe and measure a fact, then you make a hypothesis that could explain the facts, then you build a mathematical model to quantify the hypothesis and make further predictions to check the hypothesis against reality. If the hypothesis always matches reality, does not contradict other theories and explains all current data, then it finally becomes a scientific theory. A scientific theory falls, or must be adjusted, as soon as it contradicts new findings. A scientific theory is not proof; it's the simplest and best description/model we have for certain observations.
That is indeed possible, except we cannot see all the way around the globe, because of obstacles, diffusion of light and varying refraction. If refraction is k = 1, light is bent along the surface of the earth for hundreds of miles. This can be observed in reality quite often under certain conditions. But you have confused terrestrial refraction with astronomical refraction.
"My" Globe Model is not based on Refraction! My Globe Model is based on a sphere with radius 6371 km. There is nothing to correct. Refraction is only an additional feature. You can set Refraction to zero anytime. It does not change the Globe Model.
Refraction is not an invention but an observation, see Monitoring of the refraction coefficient in the lower atmosphere using a controlled setup of simultaneous reciprocal vertical angle measurements.
Nowhere do I pretend this. You cannot falsify the implemented Globe model by referring to (well known) refraction, which plays a minor role here anyway (I implemented this feature only in a later version). The Globe model itself has nothing to do with refraction. It is simply a geometrical model of a sphere, just as the Flat Earth model is a geometrical model of a plane.
Forget refraction and compare the Globe and Flat Earth predictions with the pictures presented. You won't find any match with the Flat Earth model, with or without refraction.
And why do you deny the observed dip of the horizon as presented on this page? This cannot be explained by the Flat Earth model at all.
I love this App; it shows so much but is still simple to use. I have seen a number of people suggest that Soundly's videos of the causeway indicate that the earth is actually a tube of 10 km diameter, and that's why the horizon is flat but the causeway curves. It would be interesting to model this just to show what nonsense that is.
Rob Smith: I have seen a number of people suggest that Soundly's videos of the causeway indicate that the earth is actually a tube of 10 km diameter, and that's why the horizon is flat but the causeway curves. It would be interesting to model this just to show what nonsense that is.
See my new Blog Article Perspective on a Globe. In this article you can play with an App showing a grid of a Globe Model to see how it is distorted by perspective especially when viewing from low altitudes.
Thanks for all the work on this it obviously took a lot of time. Hopefully some middle schools or high schools are getting to use this.
I took a picture of the San Mateo bridge in the San Francisco bay from Seal Point. The bridge is the longest in California and has a 5 mile (8km) flat causeway section. I was about 8 miles away (13 km) and it looked very much like a Soundly picture.
Just dropping by to say brilliant work. It won't do a thing to convince idiots that their magic god disk "theory" is bullshit, but it's still brilliant.
It is obvious that NASA has the same or a similar computer model of a globe onto which they project the CGI composites of the earth, and projects it under the (fake) ISS station. The live images of the earth under the (fake) ISS are computer generated and not real. Why don't you compare your ball to the fake images from the moon, since we "know" how far the moon is away from us? Since the belief in a ball earth is based on NASA's fake moon landings, why don't we go back to the foundation of it all? Do you think that if some sociopath lies to you once he will not lie to you twice?
Paul, there are thousands of pre-Photoshop images of the earth from space and the moon, and hundreds of analog videos. There are tens of thousands of pages of documentation of the Apollo project, the missions and the science experiments. There are hundreds of original audio clips of the missions. The rockets were built, and tracked by astronomers all over the world, to the moon and back. We have hundreds of kilograms of moon rocks that have been analyzed by thousands of people worldwide; to fake the rocks would take hundreds of years. We have science experiments on the moon that sent results to earth even after the Apollo program was finished. You can read the reports online. We have retroreflectors on the moon that reflect lasers from earth to measure the distance to the moon almost daily with sub-cm accuracy.
It is cheaper to go to the moon and land on it than to fake all this.
We currently have space probes in low moon orbits that send images of the landing sites, where you can see the remains of the missions and even the paths of the astronauts. They send images of the earth too. We have non-NASA landers on the moon right now that send images from the surface that look exactly the same as the footage from the Apollo missions (e.g. no stars, no thrust crater, same color of the moon).
So, why would NASA produce all this and put it online for moon hoax believers that are too lazy to read them anyway?
Hoaxers call all images fake because they look different than they expect. Of course they look different: space and the moon are different. If you had a basic understanding of physics, you would expect exactly what we can see in all the footage.
Here are the links to all about the Apollo missions:
Apollo Lunar Surface Journal from the National Space Science Data Center NSSDC
All mission reports, images, videos, audio, transcripts, science reports, sample catalogues, technical debriefings, biomedical results, scientific results and much, much more:
Preliminary Science Reports Apollo 11–17
Master Catalog
The NSSDC Master Catalog is available for the queries pertaining to information about data that are archived at the NSSDC as well as for supplemental information:
National Space Science Data Center
"The National Space Science Data Center serves as the permanent archive for NASA space science mission data. "Space science" pertains to astronomy and astrophysics, solar and space plasma physics, and planetary and lunar science. As the permanent archive, NSSDC teams with NASA's disciplinespecific space science "active archives" which provide access to data to researchers and, in some cases, to the general public. NSSDC also serves as NASA's permanent archive for space physics mission data. It provides access to several geophysical models and to data from some nonNASA mission data."
I have an issue with all the available curvature calculators that I ran into: none of them seems to account for the relative tilting of extremely tall objects over huge distances.
Example: two observation points of the same height, a quarter of the earth's circumference apart, with the lines of sight tangent to the earth's surface.
I realized that when I was testing my spreadsheet curvature calculator. I trust my calculator, but I still need some assurance or confirmation, and if I'm right about my conclusion, then why don't they account for it? (For me, accounting for the tilt was not an objective but just emerged naturally from my equations.)
My level of knowledge of geometry is not up to your level; I'm just clever in using the little that I know, so I hope you can bear that in mind if you are going to explain stuff to me.
I have an issue with your opening statement... "For us living beings on the surface of the earth, the earth looks flat. For this reason the so-called Flat-Earthers (FE) claim that the earth is a flat plane rather than a globe."
It is not that the earth simply looks flat.
It is because we are able to see things at a distance that should be obscured by the curvature of the earth, especially large lakes and the ocean.
We would not believe the earth is flat simply because it looks flat. In fact, we have been conditioned to falsely see curvature at the horizon, and it does seem that objects go over the horizon until they are zoomed in on.
Thanks for your work on this calculator and app... I will be using it with local observations I have made at 100km over the ocean.
Michael: It is not that the earth simply looks flat. It is because we are able to see things at a distance that should be obscured by the curvature of the earth.
How do you figure out what should be obscured by the curvature of the earth?
It heavily depends on the distance, observer height and refraction. If you take all this into account, you can compute exactly how much of an object in the distance is hidden, eg by using my Advanced Earth Curvature Calculator. And you would be surprised, how much more should be expected to be seen in the distance.
The problem with many flat earthers is that they ignore refraction completely, get the observer altitude wrong because they think it does not matter much, and even use wrong formulas to calculate the hidden part (8 inches per mile squared).
So you must know the distance, the observer height and refraction to predict how much is visible and hidden.
Many flat earthers think refraction is an excuse and cannot raise whole mountains or cities into view. But they don't know that the apparent raise due to refraction is greater the farther away an object is, because refraction acts over a longer distance, while objects appear smaller with increasing distance (angular size decreases). These two effects work together. For example, if you have a small refraction angle of 0.1° and the angular size of a mountain in the distance is only 0.2° (an 873 m high mountain at 250 km distance), it gets raised by 436 m, i.e. half of its height! See the Mountain Canigou Demo.
When measuring the earth in geodesy over long distances, refraction is always applied to the measurements. Surveyors either measure the temperature gradient, humidity and pressure of the air and calculate refraction from that, or better yet, they use special instruments or special configurations that can measure refraction directly along the line of sight. Nowadays they use GPS, which eliminates the effect of terrestrial refraction. Astronomical refraction is still taken into account, because it influences the signal between GPS satellite and receiver, but the stations handle this automatically. We get the positions of such stations to cm accuracy and find without any doubt that the earth is a globe with some small irregularities.
Refraction is often underestimated. At altitudes above 50 m it is quite standard, between 0.13 and 0.17 in most atmospheric conditions. But in winter near the ground, e.g. at 1.5 m, you quite often get a mean refraction of 0.8, as studies have shown. Go lower and you get refraction values even greater than 1, which means the light follows the surface of the earth for hundreds of miles. The earth looks perfectly flat in these cases and you can see a laser from the ground.
So you have to factor in all relevant parameters to calculate the hidden/visible part of objects in the distance! You always have to apply at least standard refraction for altitudes above 50 m. Below that it can get much higher or even negative (mirages).
Please use my Advanced Earth Curvature Calculator to compute what is expected to be seen taking refraction (and angular size) into account.
You ROCK STAR! I am in awe. Well done!
Very useful app. Thx.
But there is another thing that should be included. Geoid.
It is another thing why are objects visible at a long distance.
https://en.wikipedia.org/
Pavel: But there is another thing that should be included. Geoid. It is another thing why are objects visible at a long distance.
Geoid variations are negligible compared to the effect of standard refraction, especially at distances over 50 km. The greatest Geoid variation I have found is less than 5 m / 100 km, corresponding to an angular size of less than 0.003°. Standard refraction raises an object at 100 km by about 127 m, which corresponds to a refraction angle of 0.072°. So the Geoid variation over 100 km is at most 4% of standard refraction. Over longer distances the Geoid variation compared to refraction gets much smaller still (see table), because refraction lift increases with the distance squared, while Geoid deviations increase at most linearly with distance:
Note: In the table a standard refraction of k = 0.17 and an observer height of 2 m are assumed. Refraction lift is the apparent raising of the target with respect to the horizon line; the lift with respect to eye level is even greater.
Distance             10 km   20 km   50 km   100 km  200 km  500 km
Max Geoid variation  0.50 m  1.00 m  2.50 m  5.00 m  10.0 m  25.0 m
Std Refraction lift  0.63 m  3.94 m  29.9 m  127 m   522 m   3313 m
Geoid/Refraction     79%     25%     8.4%    3.9%    1.9%    0.8%
Over distances shorter than 10 km, refraction lift can be ignored because objects are not hidden behind the horizon. If you know the Geoid variation you can correct the hidden value accordingly. But if the Geoid variation between observer and target is not a hump or a valley, but rather a constant slope, then the result of my App is the same as without Geoid variation. If the Geoid forms a hump, you can lower the refraction value a bit to account for the fact that more is hidden. If the Geoid forms a valley, you can increase the refraction value to account for it.
As we generally don't know the exact refraction coefficient anyway, we can simply ignore Geoid variations.
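The quadratic-versus-linear scaling can be illustrated with a short sketch. Note the simplification: the refraction lift here is the simple eye-level approximation k·d²/(2R), which is somewhat larger than the horizon-relative lift in the table, and the assumed maximum Geoid slope of 5 m per 100 km is the figure from the text.

```javascript
const R = 6371000; // earth radius in meters
const k = 0.17;    // standard refraction coefficient

// Apparent refraction lift relative to eye level (circular-arc approximation):
// grows with the square of the distance d (meters)
const refractionLift = d => k * d * d / (2 * R);

// Maximum Geoid variation, assumed to grow linearly at 5 m per 100 km
const geoidVariation = d => 5 * d / 100000;

for (const d of [50000, 100000, 500000]) {
  const ratio = geoidVariation(d) / refractionLift(d);
  console.log(`${d / 1000} km: geoid/refraction = ${(100 * ratio).toFixed(1)}%`);
}
```

Doubling the distance quadruples the refraction lift but only doubles the Geoid variation, so their ratio halves; this is why the Geoid can be ignored at long range.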
I'm absolutely not convinced of all your explanation sir, knowing that any kind of videos, images and pixels are not scientific proofs, and as I know gravity is still not proved in 2018, or, just show me the gravity equation that magically curve the water of all oceans.
Also I would like to know why in 2018 we, everyday people are still not experiencing space, as promised 10, 20 and 30 years ago? something wrong maybe? let's continue to ask questions as long as we don't experiencing the reality they show us in TV.
thetruthis: knowing that any kind of videos, images and pixels are not scientific proofs
There are no scientific proofs. Science cannot and does not prove anything; only math can prove. Science delivers evidence and explanations, and builds mathematical models that describe observations and let us predict them. You need predictability to be able to falsify a hypothesis: you have to know precisely what outcome of an experiment you expect, in order to compare it with observational data.
I have created a computer model (which is also a kind of mathematical model) that predicts the outcome of observations. Images of objects are observations. You can measure the shape of observed objects from images quantitatively. If you know that an image of an object has no or very little distortion, you can compare it 1:1 with the prediction of the computer model, which I have done here. If the image matches the globe model, this is evidence that the model is correct. In this case, the image, e.g. of the causeway, matches the model exactly, so this is evidence that the model of a globe earth with radius 6371 km is right. The image does not match the prediction of the flat earth model at all, so the flat earth model is falsified; it is the wrong model. I can even measure the radius or the refraction from an image and the model.
thetruthis: and as I know gravity is still not proved in 2018
Sunlight is not proven either. But both sunlight and gravity can be observed by our senses and measured. You know whether you are standing up or upside down because you feel gravity. Both are real. Do you see how ridiculous your comment is?
And again science does not prove anything.
But we have a Scientific theory of gravity, which is one of the best tested theories we have. A scientific theory is an explanation of an aspect of the natural world that can be repeatedly tested, in accordance with the scientific method, using a predefined protocol of observation and experiment. Established scientific theories have withstood rigorous scrutiny and embody scientific knowledge.
thetruthis: or, just show me the gravity equation that magically curve the water of all oceans.
See my Earth Gravity Calculator for gravity equations for the earth.
How is water bent around the globe by gravity?
This has to do with the fact that the surface of any liquid in equilibrium always lies everywhere on the same equipotential surface, no matter what shape that surface has. An equipotential surface is a surface on which the gravitational potential is exactly the same everywhere, so the gravitational force is everywhere perpendicular to it. Each equipotential surface of the earth is a sphere (or more accurately an ellipsoid) around the center of the planet, because on such a sphere the distance to the center of the earth is the same everywhere and thus the gravitational potential on this sphere is the same everywhere. The equipotential surfaces form layers of spheres, with the potential changing with increasing distance from the center of the earth. So every equipotential surface is a sphere around the earth.
Now, any local deviation in height of the liquid from the equipotential sphere the surrounding liquid lies on causes that deviation to be attracted to the earth differently than its surroundings. This produces tangential forces at that location, which drive surface currents of the liquid until the whole surface of the water lies on a sphere with the same potential everywhere, so that no tangential forces arise anymore. Water on an equipotential surface is said to be at the same level everywhere. It is level, but not flat.
So it is a natural process described and predicted by the laws of gravity that water curves around a planet.
You can fill a bucket with water and rotate the bucket around its vertical axis. This adds centrifugal forces to the gravitational force. Together these forces form paraboloidal equipotential surfaces, so the water surface in the bucket conforms to a paraboloid of revolution. Water is not always flat!
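The shape of the rotating water surface follows from balancing gravity against the centrifugal term, which gives a height z(r) = ω²r²/(2g) above the lowest point, a paraboloid of revolution. A quick sketch with example numbers of my own choosing:

```javascript
const g = 9.81; // gravitational acceleration, m/s^2

// Height of the rotating water surface above its lowest point at radius r
// from the rotation axis, for angular velocity omega (rad/s): z = w^2 r^2 / (2g)
const surfaceHeight = (omega, r) => omega * omega * r * r / (2 * g);

// A bucket spun at 10 rad/s: the surface rises about 5 cm at 10 cm from the axis
console.log(surfaceHeight(10, 0.1)); // ~0.051 m
```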
thetruthis: Also I would like to know why in 2018 we, everyday people are still not experiencing space, as promised 10, 20 and 30 years ago? something wrong maybe?
It's because everyday people do not earn enough money to pay for such flights. Such flights are still much too expensive for commercial use. Who would spend millions, risk their life and probably get sick, only to see the earth from space for a short time? But if you have the money, you can go to space right now, or become an astronaut. More than 500 people have already been in space, even a billionaire who is not an astronaut. As I said, if you have the money...
But now we are changing the topic.
In my opinion this is the best "flat earth calculator" site in existence. You not only made a superb job of building a very detailed, accurate, and easy to use simulator, but also augmented it with discussions and explanations of the geometry and physics behind not-so-well-known topics such as refraction and perspective. These details are often misused (on purpose) by flat earthers in clunky attempts to justify their beliefs. It's good to have access to your collection of facts whenever one has to debunk some outlandish claim made by them.
I myself tried my hand at your source code, as a pastime exercise in JavaScript, attempting to add clouds to the simulated landscape (building on the objects infrastructure, but with a variable height). I have nothing presentable so far, but maybe (when I retire) I will be able to come up with some usable piece of code.
Best, and keep up the excellent job!
any chance for fisheye / barrel distortion simulation for a future feature?
@Priyadi, this is not possible with vector graphics, which this App is based on. But that is a good idea for a future refraction simulator.
We, living creatures on the Earth, can detect with our appliances and observation of sky, that our earth is spherical. But what if the sky is not what we are thinking about, and if there is negative refraction, though we are thinking it is positive? Due to ethernal nature.
@Nicolas, refraction in general and atmospheric refraction in particular are well known physical effects. Refraction has always been taken into account in astronomical observations and in surveying.
Negative refraction is only possible in thin layers directly over the ground, when the ground is hotter than the air above. It produces a layer of inferior mirage, a mirroring. Above that layer the refraction is always positive, because the air cools with altitude. This makes the earth look even flatter than it is. So refraction is NOT the cause of the curvature measured; on the contrary. Surveyors and astronomers therefore never make measurements at low levels over the ground, and always take refraction into account.
Did you know that refraction is even measured and taken into account in measurements of large structures such as airplane wings in assembly halls, e.g. by Boeing? Here again the scientific theory of refraction is confirmed to be correct. The temperature gradient in a hall can be big: cold at the floor, hot at the roof. For example: temperature gradient in the hall 0.5°C/m, wing size 30 m, refraction deflection 0.4 mm.
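The order of magnitude of such a deflection can be checked with a sketch. It assumes the ray follows a circular arc of radius R/k, the usual surveying model, with a refraction coefficient of roughly k = 3 for a strong indoor gradient; that value is my own rough estimate, not Boeing's actual calculation, and it lands in the same sub-millimeter range as the 0.4 mm quoted above.

```javascript
const R = 6371000; // earth radius in meters

// Sideways deflection of a light ray over path length s (meters), if the ray
// follows a circular arc of radius R/k: deflection = s^2 / (2 * R/k)
const deflection = (s, k) => s * s / (2 * (R / k));

// Assumed k ~ 3 for a strong indoor gradient of ~0.5°C/m (rough estimate):
const mm = deflection(30, 3) * 1000;
console.log(mm.toFixed(2) + ' mm'); // a few tenths of a millimeter
```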
There are many, many studies about refraction, both theoretical and with practical experiments, researching how refraction influences surveying. Here are a few examples:
See also:
Negative refraction, or hypothetical negative refraction, is the very thin and brittle thread on which the very opaque and weak hypothetical Flat Earth uncertainly hangs. And this thread could easily be cut... or approved, if it is true. But, alas, in the materials given in your links, I still haven't found such scissors. In experiments and measurements concerning terrestrial refraction, the known radius of the Earth is the base point. But if we suspect some possibility of a Flat Earth, however weak, we can't base our conclusions on the known radius of the Earth. Really, it is possible, but it was not done. Or, really, excuse me, I still have not seen it.
@Nicolas. Refraction is a well studied phenomenon. We can measure refraction and correct the measurements accordingly. There are many methods to do so. Nowadays we have methods like GPS which are not subject to terrestrial refraction. All benchmarks (reference points) have been measured again and again over the centuries, using different devices and different methods. They all agree within the error margins of the devices used. GPS is based on the WGS84 ellipsoidal model. You can download the full GPS specification so you can build your own navigation system. There has been no doubt for centuries that the earth is a globe. The fact that all maps and navigation systems based on the globe are correct proves the globe.
In experiments and measurements, in which goes about terrestrial refraction, the known radius of Earth is the base point.
That's because we have known for sure that the earth is a globe since we began to measure it. But if you want, you can still measure refraction without assuming a globe and correct your measurements accordingly. If you do so, you will always measure that all targets in the distance drop due to the curvature of the earth, not by a random amount as would be expected from refraction alone, but exactly as predicted by the globe model with radius 6371 km. If you know the size of the globe for sure, you can make your life as a surveyor easier by using spherical calculations. You don't have to prove the globe with each single measurement if that was done long ago and confirmed many times.
Here I have developed A Method to Measure Terrestrial Refraction without assuming a globe.
How did you get to equation 3 (Refraction Coefficient k)?
Amir, read the line above equation 3; there is a link to the source in Wikipedia. I found it in some documents about surveying too. See Terrestrial Refraction in Wikipedia.
We can derive a mathematical model of the atmosphere from basic physical laws, experiments and observations. Such a model is very complex, but can be approximated by simpler equations for practical purposes. And because the atmosphere is a complex chaotic system that cannot be predicted beyond a certain accuracy anyway, we can just as well derive an equation from a simplified model of the atmosphere directly.
We know that light gets bent into a complex curve through the atmosphere, depending on the local atmospheric conditions along the light ray. But we can approximate the complex curve by a circular arc with enough accuracy for practical purposes under normal atmospheric conditions (e.g. no mirages). This is how equation 3 is derived. The constants in this equation are gathered from measurements. They may be slightly different at different locations, times and seasons. But for the purposes of geodesy the equation is accurate enough if you know under which conditions it may be applied.
Comparing equation 3 with more sophisticated models showed that you don't get more accuracy with more complex models in practice. You can't measure the atmospheric conditions over the whole path to calculate refraction precisely. But we can measure the actual refraction with instruments and check it against the calculations. As we know the conditions under which equation 3 is accurate enough, we simply have to ensure that it is only applied under those conditions. If you need more accuracy, you have to measure the current refraction with dedicated instruments.
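For reference, a commonly cited circular-arc form of the refraction coefficient from the surveying literature (which may differ in its constants from the exact equation 3 used on this page) can be sketched as:

```javascript
// Refraction coefficient k from pressure P (hPa), temperature T (K) and
// temperature gradient dTdh (K/m); commonly cited surveying approximation.
const refractionCoeff = (P, T, dTdh) => 503 * P / (T * T) * (0.0343 + dTdh);

// Near-standard conditions: P = 1013 hPa, T = 283 K, dT/dh = -0.0065 K/m
console.log(refractionCoeff(1013, 283, -0.0065)); // ~0.17, standard refraction
```

A steeper negative gradient (hot ground) lowers k toward zero or below; an inversion (temperature increasing with height) raises it well above the standard 0.13 to 0.17.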
I have implemented an App that can simulate complex refractions like mirages from atmospheric conditions you have to provide: Simulation of Atmospheric Refraction
Hello, Great work!
I especially love the curvature calculator : Advanced Earth Curvature Calculator.
I have to say that I also made a very simple version of it, before I discovered yours: https://repl.it/
After using it, I have a question about the focal computation you implemented:
Why does the focal-to-view-angle ratio correspond to a 43.2 mm wide sensor? No sensor of this size exists, and this value corresponds to the diagonal of the very common 24×36 mm sensor. Why didn't you use 36 mm instead of 43.2 mm?
The focal length can be used to express the magnification of a lens. But focal length specifications depend on the size of the sensor or film frame. Because cameras use many different sensor sizes, it would be difficult to compare lenses for different cameras. The most common format, 35 mm film, has an image frame of 36×24 mm (aspect ratio 1.5), which gives a diagonal of 43.2 mm. It is not true that no sensor of this size exists: cameras using sensors of this size are called full frame cameras and are very common in the professional sector. Every big camera manufacturer sells full frame cameras.
To be able to compare the magnification of different lens/sensor combinations, it is common practice to give a 35 mm equivalent focal length for each lens. By convention the diagonal of a 35 mm film frame (43.2 mm) is used as the reference size, but it is called 35 mm equivalent anyway.
I could have used any sensor size in my calculations, but I chose the same 35 mm equivalent that lenses use. This way you can enter the focal length published for an image, as stated e.g. in its EXIF data, and my App displays the scene exactly as the camera/lens combination captured it. In this manner I can render images that can be superimposed onto camera images so that they match. Conversely, I can find the focal length used by matching a camera image with the rendered image.
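The conversion between focal length and angle of view described above can be sketched as follows. The 43.27 mm constant is simply the diagonal of a 36×24 mm frame, √(36² + 24²); the function name is my own, not the App's.

```javascript
// Diagonal of a full-frame 36 x 24 mm sensor, the "35 mm equivalent" reference
const DIAGONAL = Math.hypot(36, 24); // ~43.27 mm

// Diagonal angle of view in degrees for a 35 mm equivalent focal length f (mm)
const angleOfView = f => 2 * Math.atan(DIAGONAL / (2 * f)) * 180 / Math.PI;

console.log(angleOfView(50)); // a "normal" 50 mm lens: ~46.8 degrees
console.log(angleOfView(24)); // a wide 24 mm lens: ~84 degrees
```

Inverting the formula gives the 35 mm equivalent focal length for a given angle of view, which is how a FoV setting can be expressed as a focal length.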
Presumably the data describing the scenes are saved in some sort of file format. What format, what tool do you author those in?
I am impressed that all this is done in JavaScript. Really impressive.
Roger, for maximum flexibility the scenes are generated programmatically and not stored in a file. The whole App, including the graphics subsystem, is programmed by me in JavaScript, using the canvas element for output. The graphics system implements 2D, 3D and perspective transformations and 2D and 3D clipping of arbitrary graphic elements such as polygons, ellipses and splines, both outlined and filled. It provides an easy to use interface. See 3DGraphX.