While out walking this afternoon I came across an ongoing Tesla Solar Roof installation in the neighborhood. Last week the old roof had been stripped from the house and a Tesla-branded underlayment installed, so I guessed that the Tesla Solar Roof tiles were the next step.
True to form, a large work party with a truck and a sign advertising a Tesla install appeared this morning. This afternoon I checked on their progress, and the panel installation has begun.
The individual tiles appear to be delivered in palletized boxes, and one is shown below.
I believe the picture shows four panels, with the black triangular piece being the attachment point to the roof. I will have to investigate further; it seems the panels have significant space underneath them, and they must be sturdy, as the installers appear to be walking on them (in soft shoes) in the picture above.
Once the dust has settled I will speak with the homeowner to get more information on the size of the install. In the meantime, I did a quick search on the Tesla Solar Roof site to get a price recommendation, and the following were some of the results.
Pricing for two roof sizes from the Tesla site (price, and price after incentives):

$36,748, or $31,166 after incentives
$45,552, or $38,867 after incentives
All in all, it looks like a pretty expensive setup, with a marginal price difference between the 4.1 kW panel setup and the 10.2 kW setup. Checking on Amazon, 30 Renogy 320 W monocrystalline solar panels would set you back $9,700, which is close enough to 10 kW of panels. Yes, there are wiring, framing, installation, and permitting costs, but do they add up to $20K? But it's Tesla, so it must be good for you… I assume that Tesla will get the unit cost of the solar panels down with mass adoption.
Continuing the theme (part1, part2) of using GPS data to augment network graph data, this post will consider including that data in the network graph, and will also introduce rendering the network graph onto a geographical map visualization.
NetworkX graph package
In these discussions I have been using the Python NetworkX package to store the graph structure; nodes in this graph have city names as identifiers, and each edge is created as a tuple (pair) of city names.
import networkx as nx
import matplotlib.pyplot as plt

G = nx.Graph()
Cnames = ['Vancouver', 'Portland', 'San Francisco', 'Seattle',
          'San Antonio', 'Dallas', 'Austin', 'Sevilla',
          'Belfast', 'Sydney', 'Canberra', 'Tokyo']
G.add_nodes_from(Cnames)  # add each city as a named node
The NetworkX package allows for easy graph creation and manipulation, and allows attributes to be attached to the graph structure. The package then provides graph algorithms (shortest path, clique finding, graph analysis), graph traversal and neighbor queries, and graph drawing via standard plotting packages such as matplotlib.
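As a minimal sketch of the algorithm support, hypothetical distance attributes on a few of the city edges are enough to drive a weighted shortest-path query (the distances below are rough great-circle figures chosen for illustration, not values from the model):

```python
import networkx as nx

G = nx.Graph()
# hypothetical edges between a few of the cities, weighted with
# approximate great-circle distances in km
G.add_edge('Vancouver', 'Seattle', distance=193.0)
G.add_edge('Seattle', 'Portland', distance=279.0)
G.add_edge('Portland', 'San Francisco', distance=863.0)
G.add_edge('Vancouver', 'San Francisco', distance=1290.0)

# weighted shortest path: the direct edge (1290 km) beats the
# coastal hops (193 + 279 + 863 = 1335 km)
path = nx.shortest_path(G, 'Vancouver', 'San Francisco', weight='distance')
print(path)
```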
At this point we have the capability to generate a network graph with named nodes and edges, where each edge carries a distance calculated from the GPS locations of its nodes. An application of interest to me is calculating an idealized latency value for the network, i.e. the idealized one-way latency between location x and location y.
One-way latency for model
For the global network model that we have incrementally built, we started with cities as the nodes of the network; using the GPS coordinates of these cities, we created graph edges between the cities and calculated the great-circle distance between them.
The next step in this model is to assume that a single fiber optic runs between these cities. This is unrealistic for geographical, technical, electro/optical and political reasons but is adequate for the model that I am building.
Light traveling in a fiber optic travels at a different speed than in free space. In free space light travels at 299,792,458 m/s (approximated as 3×10^5 km/s). In an optical fiber light travels slower than in free space, roughly 35% slower, a consequence of the refractive indices of the fiber's core and cladding.
A fiber optic is composed of two different materials: the optic core and the optic cladding. As light leaves the core and enters the cladding it is refracted; at a certain angle (called the critical angle) the light is instead totally reflected back into the core. As shown in the figure above, this results in the light ray ‘bouncing’ along the fiber from one end to the other.
In the figure, an angle θ is shown as the incident and reflected angle; there is also a distance x marked, which is the linear distance along the fiber from one reflection point to the next. While light in free space would travel the horizontal distance x directly, inside the fiber it has to travel the longer distance denoted x/sin(θ). Assuming θ = 45 degrees, sin(45°) = 0.707, so the light in the fiber has to travel x/0.707, which in this example is about 1.41 times x; hence the earlier rule of thumb of light effectively traveling roughly 35% slower in a fiber optic.
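The geometry above reduces to a single path-length factor. A quick sketch of the arithmetic, taking the 45-degree angle as an assumed incidence angle:

```python
from math import radians, sin

theta = radians(45)             # assumed incidence angle inside the fiber
path_factor = 1 / sin(theta)    # extra path length per unit horizontal distance
c_free = 3.0e5                  # km/s, approximate free-space speed of light
c_fiber = c_free / path_factor  # effective horizontal speed inside the fiber
print(round(path_factor, 3), round(c_fiber))
```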
Given this figure, a modeled speed of light in fiber of 2.121×10^5 km/s can be used.
from geopy.distance import great_circle

# Cities maps a city name to its (latitude, longitude) pair
for name in Cnames:
    distance = great_circle(Cities['Dublin'], Cities[name]).km
    G.add_edge('Dublin', name, distance=round(distance, 1))
    # idealized one-way latency in seconds: distance / fiber speed
    latency = distance / (2.121 * 10**5)
This latency calculation assumes that the fiber optic is laid in a ‘straight line’ along the great circle between the two locations, and it does not account for optical regenerators or any intervening optical or networking equipment.
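As a worked example under these assumptions, the idealized one-way latency over a hypothetical 1,000 km great-circle link:

```python
distance_km = 1000     # hypothetical great-circle link length
fiber_speed = 2.121e5  # km/s, modeled speed of light in fiber
latency_ms = distance_km / fiber_speed * 1000
print(round(latency_ms, 2))  # about 4.7 ms one-way
```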
In a previous post I discussed calculating distances from GPS coordinates; subsequently I came across a good Python library that offers this functionality as well as a lot more. I decided that anybody interested in distance calculation via GPS coordinates will probably appreciate everything else that the GeoPy library has to offer.
In addition to the distance calculation, GeoPy offers client functionality for accessing several popular geocoding web services.
From the Readme file of the package’s GitHub site:
geopy makes it easy for Python developers to locate the coordinates of addresses, cities, countries, and landmarks across the globe using third-party geocoders and other data sources.
Geocoding is provided by a number of different services, which are not affiliated with geopy. These services provide APIs, which anyone could implement, and geopy is a library which provides these implementations for many different services in a single package.
With the proliferation of GPS tagging in data sets, a useful calculation is deriving the distance between two points, which is possible using the haversine formula. The law of haversines is a more general formula within spherical trigonometry that relates the sides and angles of spherical triangles. On the surface of a sphere, the ‘straight line’ connecting two points is actually the arc of a curve on the sphere’s surface. This arc lies along a great circle of the sphere, and its length is mathematically called the spherical distance.
In the haversine calculation a single Earth radius must be chosen, but the actual radius varies from about 6,356.752 km (polar) to 6,378.137 km (equatorial), so any choice introduces a distance error of up to roughly 0.5%.
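The spread between the two radii can be checked directly; the relative difference is about 0.34%, which bounds the contribution of the radius choice to the overall error:

```python
r_polar, r_equatorial = 6356.752, 6378.137  # km, from the figures above
spread_pct = (r_equatorial - r_polar) / r_polar * 100
print(round(spread_pct, 2))
```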
So, using this formula, the distance along the great-circle arc between two points on Earth’s surface can be computed in Python as:
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat_a, lon_a, lat_b, lon_b):
    r = 6370  # approximate Earth radius, for distance in km
    a1, a2 = radians(lat_a), radians(lon_a)
    b1, b2 = radians(lat_b), radians(lon_b)
    delta_lat = b1 - a1
    delta_lon = b2 - a2
    n = (sin(delta_lat / 2) ** 2
         + cos(a1) * cos(b1) * sin(delta_lon / 2) ** 2)
    n = min(n, 1.0)  # clamp floating-point overshoot before asin
    return 2 * r * asin(sqrt(n))
I recently came across a compilation of columns that were originally published in Nature Methods. The compilation was subsequently published in 2015 as a document covering data visualization and presentation of scientific data. It is a brief collection (40 pages) that provides a good set of suggestions for presenting and visualizing complex data sets.
The columns cover general data-presentation topics, and also areas specific to bioinformatics such as:
Representing the genome
Representing genomic structure
Visualizing biological data
As expected, the information content is high and the presentation is professional, all within 40 pages; what’s not to like? I learned a few techniques that will help with my data presentation and visualization going forward.